
5 takeaways from the Gartner Data & Analytics Summit 2018

Sean R. Coombs

The presenter walks on stage in a dimly lit arena, stands beneath a bright, purple-hued spotlight, and delivers opening remarks using words like “artificial intelligence” and “predictive analytics.” Suddenly, the ears of more than 3,500 data professionals stand at attention, and the audience has a collective thought: “This is how we’re finally going to use our data to do something innovative.” Because that’s what the Gartner Data & Analytics Summit 2018 is all about: coming together to share experiences with our data programs, to tell stories of failure and of success, and to learn best practices from other experts in our field.

In this post, I wanted to share my key takeaways with you, so you too can benefit from this collective experience.

  1. Your data and analytics program faces four key challenges.
    It’s no secret that data is a valuable resource for business intelligence, which is why data analytics is a growing priority for organizations looking to become more data-driven. According to our own research, 46 percent of U.S. organizations say they are planning data analytics initiatives within the next twelve months, up from just 24 percent one year prior. Data analytics gives organizations an opportunity to improve efficiency, optimize customer experiences, and gain an edge over the competition. In fact, according to Gartner’s recent CIO survey, business intelligence (BI) is seen as a top enabler of competitive advantage.

    Yet, realizing value from your data and analytics program often means overcoming four critical challenges: trust, diversity, complexity, and literacy. The first challenge, establishing trust, is a top priority for organizations looking to use their data for analytics. Building confidence in your data starts with improving data quality and verifying the information in your systems. As data volumes continue to grow, however, you need a way to verify data at scale; Gartner posits that crowdsourcing and automating metadata creation is one way to establish trusted data.

    The next challenge, promoting a culture of diversity, takes many forms: hiring talent from varied demographics, reducing bias in algorithms, and opening your organization to more diverse data sources (think IoT data). According to Gartner, more diverse teams actually perform better than homogeneous ones, which is why it’s so important to focus on the culture around diversity. When it comes to diversifying your data, Gartner suggests there is a way to be agile while maintaining enterprise scalability: a bimodal approach, in which organizations integrate Mode 1 production data with Mode 2 innovation data to diversify their analytics program in a scalable way.

    Building confidence in your data and promoting diversity only add to the next challenge: mastering the complexity of running a digital business. More and more, organizations find that the centralized approach to analytics, once heralded as the gold standard, is not meeting their needs. That’s why Gartner says that organizations need to empower multiple small teams to leverage more precise data and analytics platforms that provide more content, more understanding, and more timely responses.

    The last challenge has to do with building the data literacy of your workforce. As businesses strive to become more data-driven, they will need to establish a common language and culture around data. Gartner says you can do this by providing training in context, creating a certification system for data professionals, and by leveraging augmented analytics.

  2. Data-driven organizations flip the traditional strategy paradigm.
    A lot of organizations today claim to be data-driven, but what they really mean is that they use data to address business challenges. The classic approach to data strategy begins with a change in business, process, or systems, and data and analytics are then used to address those changes. Tellingly, Gartner finds that when the data contradicts their gut feeling, 90 percent of business decision-makers would override the data.

    The difference between this approach and a data-driven approach is that, in these examples, data is inherently subservient to business needs rather than defining the needs of the business. According to Gartner, data-driven organizations flip this model and base their business priorities on what the data is telling them. For example, a data and analytics strategy is used to drive business model innovations, customer experience innovations, and to transform decision-making and process automation. The key here is that these innovations are driven proactively by the data, rather than as reactive solutions to a challenge that has come up.

  3. The needs around data quality tools and practices are changing.
    Technological advancements and consumer expectations are accelerating the pace of change for data quality tools and practices. These demand- and supply-driven changes can be linked to audience, governance, diversity, latency, analytics, intelligence, deployment, and pricing. The use cases for traditional data quality tools are data migrations, data integrations, and operational and transactional data. Yet, emerging use cases for data quality tools are around big data and analytics, information governance, and master data management. 

  4. Artificial intelligence is gaining traction.
    Organizations that are able to invest in and deploy artificial intelligence (AI) capabilities will see a distinct advantage in the years to come. Through 2022, Gartner believes that AI will be a major battleground for technology leadership, and by 2021, AI augmentation will generate $2.9 trillion in business value and recover 6.2 billion hours of worker productivity. Despite the huge gains to be had, we have a long way to go before we can reap benefits like these. While only 4 percent of organizations today say they have already invested in and deployed AI technology, a large percentage (46 percent) are in the process of planning their AI deployments in the near and far term. Another 35 percent of organizations have AI on their radar but are not actively planning a deployment. Lastly, only 14 percent of companies say they are not interested in AI technologies.

  5. Data preparation tasks are slowing innovation; enter AI.
    In data science, there are four steps to deploying a successful model: business goal setting, data preparation, model development, and model production and servicing. While the first step, goal setting, tends to focus on helping the business understand how it can use advanced analytics, the next step, data preparation, is where the majority of time is spent. According to Ryohei Fujimaki, Research Fellow at NEC, 80 percent of project time is spent on collecting, moving, wrangling, and engineering data to create a basic schema. Furthermore, nearly 90 percent of the quality of that effort depends on the skills of individual data scientists, meaning there is little consistency from person to person.

    How did NEC solve this? They built an AI program to automate the manual (and often tedious) work of data preparation. In doing so, they reduced the time spent on data preparation from 30 days to one, freeing data scientists to spend their time developing and refining their models. While this is just one example, NEC has shown that the technology holds great promise for automating time-consuming tasks, improving employee productivity, and ultimately enhancing the quality of the data on which business-critical decisions are made.
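For readers curious what the manual wrangling-and-feature-engineering work described above looks like in practice, here is a minimal sketch in Python with pandas. This is purely illustrative, not NEC's system: the dataset, column names, and derived features are all invented for the example.

```python
import pandas as pd

def prepare(raw: pd.DataFrame) -> pd.DataFrame:
    """Turn raw records into a basic schema a model can consume."""
    df = raw.copy()
    # Wrangling: drop rows missing the target label, normalize types.
    df = df.dropna(subset=["churned"])
    df["signup_date"] = pd.to_datetime(df["signup_date"])
    # Feature engineering: derive model inputs from the raw fields.
    df["tenure_days"] = (pd.Timestamp("2018-03-01") - df["signup_date"]).dt.days
    df["spend_per_order"] = df["total_spend"] / df["orders"].clip(lower=1)
    # Final schema: only the columns the model expects.
    return df[["tenure_days", "spend_per_order", "churned"]]

# A toy raw extract; the last row lacks a label and is dropped.
raw = pd.DataFrame({
    "signup_date": ["2017-01-15", "2017-06-01", "2017-09-10"],
    "total_spend": [120.0, 75.0, 0.0],
    "orders": [4, 3, 0],
    "churned": [0, 1, None],
})
features = prepare(raw)
```

Every choice in a pipeline like this (which rows to drop, which features to derive, how to handle zero-order customers) is a judgment call, which is exactly why Fujimaki notes that output quality varies so much from one data scientist to the next.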

After four days of content-packed, standing-room-only sessions, attendees left the Gartner Data & Analytics Summit 2018 with a greater sense of the trends and challenges that lie ahead for the broader data community. While many data professionals are looking to AI and machine learning as the next frontier of data use cases, it goes without saying that the quality of data used to feed these programs is critical to their success.

Want to know more? Learn how we're helping organizations like yours start trusting their data for larger projects.

Get started