
Real-Time with AI – The Convergence of Big Data and AI

Big data is moving to a new stage of maturity — one that promises even greater business impact and industry disruption over the course of the next few years. As big data initiatives mature, organizations are now combining the agility of big data processes with the scale of artificial intelligence (AI) capabilities to accelerate the delivery of business value.

The convergence of big data with AI has emerged as the single most important development shaping the future of how firms drive business value from their data and analytics capabilities. The availability of greater volumes and sources of data is, for the first time, enabling capabilities in AI and machine learning that had remained dormant for decades due to limited data availability, small sample sizes, and an inability to analyze massive amounts of data in milliseconds. Digital capabilities have moved data access from batch processing to real-time, online, always-available access.

The ability to process large volumes of data with agility is leading to a rapid evolution in AI and machine-learning applications. Whereas statisticians and early data scientists were often limited to working with “sample” sets of data, big data has enabled data scientists to access and work with massive sets of data without restriction. Rather than relying on representative data samples, data scientists can now rely on the data itself, in all of its granularity, nuance, and detail. For example, imagine the difference in fraud detection systems that can now use 24 months of transaction history instead of 6 months. This is why many organizations have moved from a hypothesis-based approach to a “data first” approach.
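To make the 24-months-versus-6-months point concrete, here is a minimal sketch of a threshold-based anomaly check on synthetic data. The data, the `is_anomalous` helper, and the z-score threshold are all illustrative assumptions, not a real fraud model: the short window happens to be unusually quiet, so it misjudges the customer's normal spending range, while the full history captures the true spread.

```python
import statistics

# Synthetic monthly spend for one customer. The first 18 months vary
# widely; the most recent 6 months happen to be unusually steady, so a
# model limited to that sample underestimates normal variability.
history_24m = [150, 260, 180, 240, 170, 250, 160, 230, 190, 220, 200, 210,
               155, 255, 165, 245, 175, 235, 198, 199, 200, 201, 202, 200]
history_6m = history_24m[-6:]  # what a sample-limited model would see

def is_anomalous(amount, history, z_threshold=3.0):
    """Flag an amount that falls far outside the history's spread."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    return abs(amount - mean) > z_threshold * stdev

new_charge = 260  # within the customer's real long-term range

# The narrow 6-month sample flags this legitimate charge as fraud;
# the full 24-month history correctly treats it as normal.
print(is_anomalous(new_charge, history_6m))   # → True  (false positive)
print(is_anomalous(new_charge, history_24m))  # → False
```

The point is not the toy model itself but the baseline: with the full history, the estimate of "normal" reflects the data in all its granularity rather than an accident of which months the sample covered.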

Making AI real-time to meet mission-critical system demands puts a new spin on your architecture. Delivering AI-based applications that scale as your data grows takes a new approach in which the data does not become the bottleneck. We all know that the deeper the data, the better the results and the lower the risk. However, performing thousands of computations on big data requires new data structures and messaging techniques, used together, to deliver real-time AI. During this session we will look at real reference architectures and review the new techniques that were needed to make AI real-time.
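The pattern the session describes — keeping application state in memory next to the model and feeding it from a message stream, so no database round-trip sits on the scoring path — can be sketched as follows. This is an illustrative assumption of the general architecture, not the X Platform itself; the queue stands in for a messaging fabric, the dict for an in-memory state store, and `score` for a toy model.

```python
import queue

events = queue.Queue()  # stands in for a messaging fabric
state = {}              # in-memory state: running totals per account

def score(account, amount):
    """Toy 'model': flag a charge over 5x the account's running average."""
    seen = state.setdefault(account, {"count": 0, "total": 0.0})
    avg = seen["total"] / seen["count"] if seen["count"] else amount
    fraud = amount > 5 * avg
    # State is updated in memory on the scoring path -- no external store,
    # so the data never becomes the bottleneck.
    seen["count"] += 1
    seen["total"] += amount
    return fraud

# Producer side: transactions arrive as messages on the stream.
for txn in [("acct-1", 100.0), ("acct-1", 110.0), ("acct-1", 900.0)]:
    events.put(txn)
events.put(None)  # sentinel to stop the consumer

# Consumer side: drain the stream, scoring each event in memory.
while (msg := events.get()) is not None:
    account, amount = msg
    print(account, amount, "FRAUD" if score(account, amount) else "ok")
```

Because both the message stream and the state live in process memory, each event is scored in a single pass with no I/O, which is the property that makes per-event AI feasible at real-time latencies.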


Webinar Presenter

Colin MacNaughton

Colin MacNaughton is the Head of Engineering at Neeve Research. He comes from a long background in high-performance messaging and enterprise middleware. He holds patents on fault-tolerant computing techniques and business process modeling, has served in standards working groups such as the AMQP working group, and is a committer on open-source messaging projects at Apache. He has architected enterprise-grade middleware solutions across many verticals, both domestically and internationally.

He joined Neeve in 2012 with the belief that the X Platform’s approach to handling both an application’s message streams and data (state) streams using memory-oriented computing techniques is the key to taking high-performance computing to the next level of performance, reliability, agility, and ease.

Webinar Moderator

Subu Sankara, VP Software Services, Synerzip.