Data & AI trends for 2021

Last week, our Co-Founder & Data Science Director, Eric Topham, hosted the latest instalment of The BI Report, exploring Data & Analytics trends and trajectories in 2021.

The panel included Mark Balakenende, Senior Director of Product Marketing at Matillion, Dedy Kredo, VP of Customer-Facing Data Science at Explorium, and William McKnight, President of McKnight Consulting Group. 

They covered a range of topics, from the importance of effective data management, infrastructure and visualisation to the growing need for security and control of data and models through privacy technologies.

The panel started with the evolution of data analytics and how data and AI will transform human-machine relationships and influence decision making. They also explored what’s to come in 2021 as we embrace the reality of at least part-time remote working for the foreseeable future, and the growing need to access data that may normally only be available in the office.

Some of the key considerations and insights from the panel included:

  1. The growing importance of external and alternative (non-traditional) data, both in using it to enhance insights and in finding it and applying it meaningfully.
  2. The rise of graph technologies, databases and capabilities to drive business decision making.
  3. The importance of data catalogues in navigating available data, securing data entering the business, and controlling whether that data is useful and effective for applications, and how it can be used.

Gartner has also recently released its Top 10 Data and Analytics Trends for 2021, covering everything from XOps to analytics at the edge; you can read more here.

But what are our top predictions?

After the session, I sat down with Eric to discuss where we at T-DAB.AI are placing our bets for next year:

  • Data Management
  • Commoditisation of machine learning algorithms
  • Federated learning
  • Data Marketplaces
  • Smarter & Faster AI

 

Data management

Running through the panel discussion and wider industry lessons is the importance of effective data management in driving quality insights. Ensuring companies get the basics right is a key theme throughout the following predictions, as quality data input is always key to delivering valuable output.

Accelerated migrations to the cloud through digital transformation programmes, and the increasing need to rely on a wider contextual data set to better manage business decisions, are driving the adoption of X analytics techniques.

Much of the data available for customer analytics has been massively disrupted by the impact of Covid, but those that responded quickly during the pandemic were rewarded. Teams leveraging X analytics techniques are able to access and utilise their data more effectively because of solid foundations built on automation, as well as by focusing on simpler algorithms with smaller data sets to deliver actionable insights. This latter trend is also a key lesson in our guide on How to Scale AI.

And of course, these insights can be presented effectively through evolving dashboarding and BI tools, improving the data narrative.

“Data quality: Having better data is still the answer”

 

Commoditisation of machine learning algorithms

Naturally, as with any advancing technology, we’re reaching the point where fine-tuning algorithms is bringing less return for some organisations. Where companies were once extracting more value than others by tweaking open-source models, they are no longer able to be as competitive. We are reaching the point of diminishing returns as common algorithms become commoditised, with many now open source.

This is being further aided by increasing access through platforms such as Azure Synapse Analytics, where teams like ours at T-DAB.AI can serve ready-to-consume pipelines to clients, accelerating time to insight. You no longer need a hardcore data scientist in-house: get the basics right and consume the output.

This progression is key when you consider how highly skills, knowledge and capability rank in Gartner’s report on barriers to AI implementation, with complexity of integrating AI solution(s) with existing infrastructure also ranking 2nd. It’s another key driver of why we developed our guide on How to Scale AI, recognising that teams need to build with what they have first to scale meaningfully.

Barriers to AI implementation (Gartner)

But the clear leader, often driven by an increasing need to access more data, is ‘security or privacy concerns’.

Federated learning

There is an array of machine learning and AI-driven security applications on the market able to rapidly scan zettabytes of data within minutes, and many companies now have security offices, practices or regulations in place to manage the use of and access to data across the business.

In addition, the introduction of privacy-enhancing technologies, which provide trusted access for external parties and protect data and algorithms from adversarial attacks, can help address the fact that data privacy is flagged as the top barrier to AI implementation.

Federated learning, however, takes this one step further, using decentralised systems to build and train a model without sharing the underlying data, side-stepping critical issues of privacy, security and access rights.

Basic models can be deployed to user devices and systems, which then train on their local data. These models periodically feed their updates back, where they are aggregated and used to improve the global model before it is redeployed to the edge.
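As a rough illustration of that loop, here is a minimal federated-averaging sketch using only NumPy. It assumes a simple linear model and synthetic per-device data; the function names and structure are purely illustrative, not any particular framework’s API.

```python
# Minimal FedAvg-style sketch: only model weights leave each device, never the raw data.
import numpy as np

def local_update(weights, X, y, lr=0.01, epochs=5):
    """Train the shared model on one device's local data (data stays on the device)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w

def federated_round(global_weights, devices):
    """One round: push the global model out, train locally, aggregate the updates."""
    local_weights = [local_update(global_weights, X, y) for X, y in devices]
    return np.mean(local_weights, axis=0)   # aggregation of weights, not of data

# Illustrative run with synthetic per-device datasets
rng = np.random.default_rng(0)
devices = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(4)]
weights = np.zeros(3)
for _ in range(10):
    weights = federated_round(weights, devices)
print("Aggregated global weights:", weights)
```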

Data & Model Marketplaces

Where good quality data is needed to augment data sets and provide actionable insights, which is most of the time, the advent of marketplaces brings easy data access and commoditised models to industry.

During the panel, Dedy from Explorium reported that 78% of US data leaders consider external data very valuable. However, most struggle to find the data they need easily: only 7% reported that low effort was required, with 46% exerting medium effort and 47% high effort. Data access therefore needs careful consideration.

Marketplaces will enable organisations to monetise their data, access trained models, access other trusted data sources, and drive new revenues.

Smarter & faster AI

There is still room for improvement. Whilst increasing access to data and the commoditisation of machine learning algorithms can improve and accelerate development, these applications or implementations are often seen as weak or narrow AI.

Strong AI solutions can be delivered by combining several machine learning models to deliver insights throughout the value chain, and by introducing cutting-edge algorithms such as the deep learning language model GPT-3.
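As a hedged illustration of what “combining several models” can look like in practice, here is a minimal sketch assuming scikit-learn is available: an unsupervised segmentation model feeds a set of downstream forecasting models. The dataset and stages are synthetic and purely illustrative, not a description of any specific production solution.

```python
# Illustrative two-stage pipeline: segment first, then forecast per segment.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))            # e.g. customer behaviour features (synthetic)
y = X[:, 0] * 2 + rng.normal(size=200)   # e.g. spend to forecast (synthetic)

# Stage 1: an unsupervised model segments the customer base
segments = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(X)

# Stage 2: a supervised model per segment forecasts the target
models = {s: LinearRegression().fit(X[segments == s], y[segments == s])
          for s in np.unique(segments)}

# Combined insight: each forecast is routed through its segment's model
preds = np.array([models[s].predict(x.reshape(1, -1))[0]
                  for x, s in zip(X, segments)])
print("Mean absolute error:", np.mean(np.abs(preds - y)))
```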

However, the pace of change is rapid. Last year, we surpassed the Google algorithm, and two weeks later, with a commitment of resources from a much larger organisation, they were back in front [read more].

Organisations are going to rely on AI for automation, information, decision making and driving business analysis.

With marketplaces, federated learning and the commoditisation of machine learning models, combining services to deliver strong AI applications is going to become more of a reality, and organisations need to be ready to make their move.

Do you want to find out more?

TO FIND OUT MORE ABOUT THE PROJECT & OUR SERVICES, GET IN TOUCH WITH THE TEAM.