Microsoft's Cloud-Scale Holographic Storage Project

Cloud computing is eating the IT world, and 125 zettabytes of data will be generated annually by 2024. Microsoft Research Cambridge and Microsoft Azure are developing new cloud-first optical storage technologies. For several years, Microsoft's Project Silica has been developing an optical storage technology that uses glass storage media. Glass media would be used …

Read more

Beyond Big Data: Big Memory Computing for 100X Speed

MemVerge™, the inventor of Memory Machine™ software, today introduced what’s next for in-memory computing: Big Memory Computing. This new category is sparking a revolution in data center architecture where all applications will run in memory. Until now, in-memory computing has been restricted to a select range of workloads due to the limited capacity and volatility …
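
The capacity and volatility limits the excerpt alludes to are what persistent-memory software works around: data stays addressable like ordinary memory but survives a restart. Here is a minimal Python sketch of that general idea only, not of MemVerge’s Memory Machine (whose internals are not public); it uses a memory-mapped file as a stand-in for persistent memory, and the file name and size are illustrative.

```python
import mmap
import os

PATH = "state.bin"   # illustrative file standing in for persistent memory
SIZE = 1024          # a small region for the demo

# Create and size the backing file once.
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(b"\x00" * SIZE)

# Write through an in-memory view, then flush so the bytes are durable.
with open(PATH, "r+b") as f:
    mem = mmap.mmap(f.fileno(), SIZE)
    mem[0:5] = b"hello"
    mem.flush()
    mem.close()

# A later run (or another process) sees the same bytes "in memory".
with open(PATH, "r+b") as f:
    mem = mmap.mmap(f.fileno(), SIZE)
    print(mem[0:5])  # b'hello'
    mem.close()
```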

Read more

Quantexa Uses Context-Aware Artificial Intelligence to Uncover Human Trafficking Networks

Quantexa has developed revolutionary Entity Resolution and Network Analytics technology that enables deeper insights into complex financial and real-world criminal problems like human trafficking. Quantexa uses this technology to model groups and people, their relationships, and their behaviors, achieving a true understanding that delivers genuine business …
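
At its core, entity resolution clusters records that refer to the same real-world person or organization. The toy Python sketch below illustrates only that general idea, not Quantexa’s proprietary method, and all of the records are invented: candidate records are blocked on a shared attribute (a phone number here) and matches are merged with a union-find structure.

```python
from collections import defaultdict

# Invented records; in practice these come from many source systems.
records = [
    {"id": 1, "name": "John A. Smith", "phone": "555-0101"},
    {"id": 2, "name": "john smith",    "phone": "555-0101"},
    {"id": 3, "name": "J. Smith",      "phone": "555-0199"},
    {"id": 4, "name": "Mary Jones",    "phone": "555-0102"},
]

parent = {r["id"]: r["id"] for r in records}

def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving keeps lookups fast
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

# Block on shared phone number, then merge the records in each block.
by_phone = defaultdict(list)
for r in records:
    by_phone[r["phone"]].append(r["id"])
for ids in by_phone.values():
    for other in ids[1:]:
        union(ids[0], other)

# Each surviving cluster approximates one resolved entity.
clusters = defaultdict(list)
for r in records:
    clusters[find(r["id"])].append(r["name"])
print(list(clusters.values()))
# [['John A. Smith', 'john smith'], ['J. Smith'], ['Mary Jones']]
```

A production system scores many fuzzy signals (names, addresses, dates) rather than one exact key, but the clustering step has the same shape.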

Read more

The Next Big Data Debacle Could Involve Amazon and China

Data is now the lifeblood of modern industry. Not only does it provide insights for effective decision-making, but it can also be used to predict and influence consumer and user behavior. Companies like Amazon and Netflix, for instance, use big data and analytics to better understand their customers in order to create products …

Read more

Does the real estate data industry need to decentralize?

The availability of up-to-date, accurate data is a fundamental requirement for the vast majority of industry sectors. Whether it's Facebook exploring where we like to eat or major credit institutions attempting to ascertain our financial health, data plays a substantial role in the world of business-related decision making. However, one such industry that …

Read more

China plans to win AI with lots of money, data, and easy regulations

China wants to integrate four areas for stronger AI: abundant data, hungry entrepreneurs, many AI scientists, and AI-friendly policy. Twenty-nine U.S. states have enacted their own laws regulating autonomous vehicles, and governors in 10 states have issued executive orders curbing testing and use. In 2018, China adopted national self-driving car guidelines that …

Read more

2018’s Hottest Tech Stock Is A Life Saver

Everyone’s heard the story about the government’s promise to spend $1 trillion on fixing America’s aging critical infrastructure. But there’s another big-money story that few investors know of… Trump’s trillion-dollar pledge won’t come close to fixing our infrastructure; it will take $3.6 trillion to make that happen. And in the midst of this infrastructure crisis, …

Read more

Commercial AI will start sorting through ISIS intelligence data within 6 months

Within six months, the US military will start using commercial AI algorithms to sort through its masses of intelligence data on the Islamic State. “We will put an algorithm into a combat zone before the end of this calendar year, and the only way to do that is with commercial partners,” said Col. Drew Cukor. …

Read more

General Fusion will work with Microsoft to mine and analyze 100 terabytes of experimental data

Canadian nuclear fusion energy startup General Fusion is working with Microsoft to analyze its fusion energy experimental results using cloud-based big data techniques. General Fusion will work with Microsoft's Developer Experience Team to build a new, cutting-edge computational platform that will enable it to mine over 100 terabytes of data from the records of …
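
The excerpt does not describe the platform’s internals, but the usual cloud pattern for mining terabytes of experiment records is a map-reduce pass: summarize each record in parallel, then aggregate the small summaries instead of the raw data. Below is a minimal local sketch of that pattern, with a hypothetical shots/ directory of per-shot sensor dumps standing in for General Fusion’s records.

```python
from multiprocessing import Pool
from pathlib import Path

import numpy as np

DATA_DIR = Path("shots")  # hypothetical directory of per-shot binary files

def summarize(path):
    """Map step: reduce one shot file to a small summary record."""
    samples = np.fromfile(path, dtype=np.float32)  # assumed float32 samples
    return {"shot": path.stem,
            "peak": float(samples.max()),
            "mean": float(samples.mean())}

if __name__ == "__main__":
    files = sorted(DATA_DIR.glob("*.bin"))
    with Pool() as pool:  # fan out across cores; a cloud cluster fans out wider
        summaries = pool.map(summarize, files)
    # Reduce step: aggregate kilobytes of summaries, not terabytes of samples.
    if summaries:
        best = max(summaries, key=lambda s: s["peak"])
        print(f"{len(summaries)} shots; highest peak in {best['shot']}")
```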

Read more

Google probably has over ten million servers

Google data centers process an average of 40,000 searches per second, which works out to about 3.5 billion searches per day and 1.2 trillion searches per year. Google was spending $20 billion per year on its data centers in 2014 and now spends about $30 billion per year. Google consumed 5.7 terawatt-hours of electricity …
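
Those figures are self-consistent at 40,000 searches per second, as a quick arithmetic check shows:

```python
# Sanity check on the quoted search-volume figures.
per_second = 40_000
per_day = per_second * 60 * 60 * 24  # ~3.46 billion, rounded to 3.5 billion
per_year = per_day * 365             # ~1.26 trillion, quoted as 1.2 trillion
print(f"{per_day:,} per day, {per_year:,} per year")
```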

Read more

DARPA wants fast encoding and processing of big data using molecules

DARPA has announced its Molecular Informatics program, which seeks a new paradigm for data storage, retrieval, and processing. Instead of relying on the binary digital logic of computers based on the von Neumann architecture, Molecular Informatics aims to investigate and exploit the wide range of structural characteristics and properties of molecules to encode and manipulate …
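
Molecular encodings are not limited to any one alphabet, but DNA data storage is the most familiar instance of the idea. Here is a toy Python sketch of mapping binary data onto a four-letter “base” alphabet and back; it is an illustration only, not DARPA’s scheme.

```python
# Map each pair of bits to one of four DNA-style bases and back.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {v: k for k, v in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[b] for b in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"DARPA")
print(strand)  # CACACAACCCAGCCAACAAC (four bases per byte)
assert decode(strand) == b"DARPA"
```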

Read more