In Conversation with Deepak Ghodke, Country Manager, India at Tableau

Deepak Ghodke, Country Manager, India, Tableau

Largely credited with bringing data visualization into the mainstream, Seattle-based Tableau Software has become the gold standard in the analytics, visualization and BI space. The company recently unveiled its product roadmap for the next three years during the keynote at TC 2016, featuring next-gen innovations such as Hyper, a new in-memory data engine; Project Maestro, a self-service data-prep tool; smart recommendations; and new collaboration capabilities. In a candid chat with Analytics India Magazine, Deepak Ghodke, Country Manager, India, Tableau, talks about some of the major developments afoot at Tableau: why self-service analytics is the next big thing in BI, and why the BI market in India is estimated to touch a whopping $213.8 million in 2017.

Read more to find out about the latest trends in Big Data and Analytics that will emerge in 2017.



[dropcap size=”2″]AIM [/dropcap] Analytics India Magazine: Could you tell us about your role in India and the focus areas? 

[dropcap size=”2″]DG[/dropcap] Deepak Ghodke: I am the Country Manager for Tableau in India and I am responsible for customer success and driving the growth of Tableau in India.

Asia-Pacific is Tableau’s fastest-growing region, and it has been a very exciting and immensely satisfying journey for us in India to date. The response since our launch in India two years ago has been tremendous from customers across industries. We feel rewarded to be an enabler for organisations to build an analytics culture.

An increasing number of customers and prospects are interested in adopting our offerings on a term or subscription basis. There is also growing appeal for our enterprise and OEM licensing models. These are the trends we will continue to encourage and embrace in the future.

[quote]Some of our clients in India are Marico, Infosys, Wipro, Ashok Leyland, Eveready, Blueocean Technologies, Star Health and Allied Insurance, CRIF High Mark and eClerx.[/quote]

 

AIM: Your recently unveiled product roadmap at the Tableau Conference was received with a lot of fanfare. Could you tell us about the product innovations Tableau will roll out in 2017?

DG: Our mission is to help people see and understand their data, because we know data can empower people to achieve great things.

To do their best work, our customers need an analytics platform that allows them to make the most of all the data in their organisation. This platform should answer deeper questions and scale as usage increases, all while keeping data secure. And that’s where Tableau comes in.

As we shared during the keynote at TC16, every part of our product roadmap is designed to empower our customers and their entire organisation to make better decisions faster with data. Here is what we have planned for the next three years.

1) A new data engine for faster analysis – Hyper: As the volume of data grows exponentially, user expectations are growing too. Our customers need immediate responses to their questions. To solve this, we’re building a new in-memory data engine with Hyper, the fast database technology we acquired earlier this year.

Hyper enables fast analysis on billions of records and near real-time data updates. It’s designed to process transactional and analytical queries simultaneously without compromising performance. Hyper will also enhance Tableau’s hybrid data model: our customers will still be able to connect live to the more than 50 sources Tableau supports, such as Amazon Redshift, Google BigQuery, Snowflake, and Microsoft SQL Server, or choose to bring some or all of their data into Tableau with Hyper. The beta for Hyper will start in early 2017.
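The pattern Hyper targets — one engine serving transactional writes and analytical reads over the same live data — can be illustrated with a toy sketch. This uses Python's built-in SQLite purely as a stand-in; it is not Hyper, and the schema and data are invented for the example.

```python
import sqlite3

# Toy illustration (SQLite as a stand-in, NOT Hyper): one engine handles
# both transactional inserts and analytical aggregates over the same table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# Transactional side: new rows arrive continuously.
rows = [("North", 120.0), ("South", 80.0), ("North", 45.5)]
conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
conn.commit()

# Analytical side: aggregate queries run over the same live data,
# with no separate batch-loaded warehouse in between.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
):
    print(region, total)
```

The point of the hybrid model is that the analytical query sees the freshly committed rows immediately, rather than waiting for a scheduled extract refresh.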

2) Project Maestro: This is a self-service data-preparation tool. We know that getting data ready for analysis is a time-consuming and difficult process. What we’ve heard from our customers is that there is an extended set of data-prep activities that our data stewards perform to support others in their organisations. That’s why we’re excited to announce a brand new data-prep product codenamed Project Maestro.

3) New data governance capabilities: Enterprises will soon be able to govern their self-service analytics environment at scale with new functionality in Tableau Server to certify data sources, easily conduct impact analysis on sources and workbooks, promote content and create workflows with simple drag-and-drop gestures. These capabilities will be available in 2017 and beyond.

4) Next leap in analytics: Tableau continues to invest in making analytics easier for everyone. For example, Natural Language Processing will bring new ways to interact with data through human language. Tableau is also adding instant analytics, a new capability that will automatically provide contextual information as users interact with their data, helping them find insights faster. These features will become available progressively throughout 2017 and beyond.

5) Tableau Server for Linux: Tableau revealed a version of Tableau Server for the Linux platform. Now users of the open-source operating system, including governments of all levels, educational institutions and businesses of every size, will be able to leverage the power of self-service analytics. Tableau for Linux will be available in 2017.

6) New collaboration capabilities: Tableau showed upcoming capabilities to help users collaborate with each other and monitor the metrics they care about, enabling self-service at scale. Data-driven alerting will make it easier for people to stay on top of their data and be notified when key metrics exceed a specific threshold. Customers will be able to collaborate and discuss insights directly within an analysis to drive better business outcomes starting in 2017.
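The core mechanism behind data-driven alerting — compare a tracked metric against a configured threshold and notify when it is exceeded — can be sketched in a few lines. This is a hypothetical illustration only; the function names, rule format, and metric names are invented and are not Tableau's API.

```python
# Hypothetical sketch of data-driven alerting: each rule names a metric
# and a threshold; an alert fires when the latest value exceeds it.
def check_alerts(metrics, alert_rules):
    """Return a message for every rule whose metric exceeds its threshold."""
    triggered = []
    for rule in alert_rules:
        value = metrics.get(rule["metric"])
        if value is not None and value > rule["threshold"]:
            triggered.append(
                f"ALERT: {rule['metric']} = {value} exceeds {rule['threshold']}"
            )
    return triggered

# Latest metric values and the subscriptions users have set up (invented data).
latest = {"daily_revenue": 125000, "error_rate": 0.7}
rules = [
    {"metric": "daily_revenue", "threshold": 100000},
    {"metric": "error_rate", "threshold": 1.0},
]
print(check_alerts(latest, rules))
```

In a real deployment the check would run server-side on a schedule or on data refresh, and the triggered messages would go out as emails or in-app notifications rather than `print` output.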

7) Smart recommendations: Tableau demonstrated future plans to add a new machine learning recommendations engine to its platform. Smart Data Discovery and Smart Recommendations are a key trend of the future and we are already getting ready to create this enabler for our customers. Algorithms will surface recommendations for workbooks and data sources that are trusted, highly used and contextually relevant to the individual workflow. Recommendations will be available in 2017.
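The idea of surfacing content that is trusted, highly used and contextually relevant can be shown with a toy ranking function. The weights, field names and sample workbooks below are entirely illustrative assumptions, not Tableau's actual recommendation algorithm.

```python
# Illustrative-only sketch of recommendation scoring: rank workbooks by a
# blend of trust (certification), usage (views) and contextual relevance
# (tag overlap with the current user's workflow). Weights are arbitrary.
def recommend(workbooks, user_tags, top_n=2):
    def score(wb):
        relevance = len(set(wb["tags"]) & set(user_tags))
        return 2.0 * wb["certified"] + wb["views"] / 100 + 3.0 * relevance
    return sorted(workbooks, key=score, reverse=True)[:top_n]

workbooks = [
    {"name": "Sales EMEA", "certified": True, "views": 900, "tags": ["sales"]},
    {"name": "HR Attrition", "certified": False, "views": 300, "tags": ["hr"]},
    {"name": "Pipeline", "certified": True, "views": 150, "tags": ["sales", "crm"]},
]
print([wb["name"] for wb in recommend(workbooks, user_tags=["sales"])])
```

A production engine would learn such weights from behaviour data rather than hard-coding them, but the shape of the problem — score, then rank per user context — is the same.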

8) New hybrid data connectivity for the cloud: Tableau showcased a new live query agent that will act as a secure tunnel to on-premises data. Data behind the firewall is easier to access and analyze with Tableau Online, Tableau’s SaaS managed service. This will be available in 2017.

 

AIM: How is Project Maestro, the data prep product going to give your business a significant competitive edge?

DG: We know that getting data ready for analysis is a time-consuming and difficult process. What we’ve heard from our customers is that there is an extended set of data-prep activities that our data stewards perform to support others in their organisations. Most people still do data prep in Excel, and data modelling is an expensive, rigid procedure that may be too hard to implement. That’s why we’re excited to announce a brand new data-prep product codenamed Project Maestro.

[pullquote]Project Maestro will make it possible for more people, from IT to business users, to easily prep their data with a direct and visual approach. You’ll instantly see the impact of the joins, unions, and calculations you’ve made, ensuring that you have exactly what you need before jumping into analysis.[/pullquote]

Project Maestro will also integrate with the rest of the Tableau platform, letting our customers quickly publish their data to Tableau Online or Tableau Server, or analyse it in Tableau Desktop.
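The prep steps described above — unions, joins and calculations applied before analysis — can be sketched by hand in plain Python. The dataset and the 18% tax rate are invented for the example; Maestro's value is that it performs these steps visually, with the result previewed at each stage.

```python
# A hand-rolled sketch (plain Python, invented data) of the kind of prep
# steps a visual data-prep tool surfaces: union two extracts, join against
# a lookup table, and add a derived calculation.
q1 = [{"order_id": 1, "region_code": "N", "amount": 120.0}]
q2 = [{"order_id": 2, "region_code": "S", "amount": 80.0}]
regions = {"N": "North", "S": "South"}  # lookup table for the join

orders = q1 + q2  # union: stack the two quarterly extracts
for row in orders:
    row["region"] = regions[row["region_code"]]          # join on region_code
    row["amount_with_tax"] = round(row["amount"] * 1.18, 2)  # calculated field

print(orders)
```

Doing the same in Excel means manual copy-paste and VLOOKUPs that must be redone on every refresh; a prep tool records the steps once and replays them on new data.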

 

AIM: You have spoken extensively about self-service analytics. What is self-service analytics and how does it enable democratization of data (you have championed the term “democratization of data”)?

DG: The new thing in the two-decade-old domain of business intelligence is self-service analytics.

This approach enables all users to answer their own questions, and it will continue to be one of the fastest-moving areas in the enterprise. As the techniques people use to drive adoption and get value from their data multiply, demand for self-service analytics grows with them. This has been witnessed among many major companies using big data.

Companies have started to prefer tools that can be used in-house rather than hiring an external agency to do the job for them. These in-house tools are meant to be used by end users in a self-service mode. This growth in users has led Gartner to estimate that the business intelligence market in India will reach $213.8 million in 2017, an 18.6% increase over 2015 spending.

India is currently among the top 10 big data analytics markets in the world. By 2025, the big data analytics sector in India is expected to grow eightfold to $16 billion, according to a recent report by NASSCOM.

In fact, LinkedIn has named statistical analysis and data mining as the second-hottest skill that can get an individual hired in 2016 globally. It is the only skill consistently ranked in the top four across the 10 countries analysed, suggesting that businesses are still aggressively hiring experts in data storage, retrieval and analysis.

Tableau is on a mission to help users see and understand their data.

To accomplish this mission, our fundamental belief is in the democratization of data, meaning “the people who know the data should be the ones empowered to ask questions of the data.” Everyday knowledge workers should have the ability to easily access their data wherever it may reside. These same knowledge workers should also have the ability to analyze and discover insights about their data without assistance from the elite few – the data scientists and IT developers.

[pullquote align=”left”]Visualizing data is important regardless of the size of the data because it translates information into insight and action. The approach to visualizing Big Data is especially important because the cost of storing, preparing and querying data is much higher. Therefore, organizations must leverage well-architected data sources and rigorously apply best practices to allow knowledge workers to query Big Data directly.[/pullquote]

We have recently launched Tableau 10 to reflect our commitment to making it easier and faster for people to work with data. One of the most important themes of Tableau 10 is to further self-service analytics for all kinds of users. Hence, this updated version has a fresh new design that makes it easy for users to grasp insights from their data. It also includes new analytical and mobile enhancements, options for preparing data and a host of new enterprise capabilities.

 

AIM: The burning question doing the rounds is the recent leadership shake-up at Tableau. Is Tableau going to be Amazon-ized with the new CEO Adam Selipsky at the helm? Your thoughts.

DG: Adam has been at AWS (Amazon Web Services) from its inception and was part of the leadership team that helped grow AWS from a start-up into a multi-billion-dollar business and establish it as the undisputed market leader in cloud computing. He has tremendous experience working with enterprises on cloud infrastructure, something that will surely benefit Tableau’s customers and one of the reasons we selected him. Tableau has been and will remain committed to delivering cloud solutions for customers who are moving in that direction.

For Adam, Tableau represents a rare combination of well-loved, customer-centric products, incredibly talented people, a dedication to technology innovation, and unrivalled momentum in the market. Adam sees Tableau as positioned to set a new standard in the world, and he has the skills and the right mindset to help build that next chapter of growth.

 


AIM: What is Tableau’s strategy in the growing Business Analytics space?

DG: Data analytics has become one of the game-changing technologies of our time, and we are witnessing a continuous increase in its adoption across industries. Companies will continue to embrace products that make it easy for them to manage and analyse growing amounts of data stored in a number of different places.

At Tableau, we will continue our mission to help people see and understand their data. In fact, we have recently launched Tableau 10 and 10.1 to reflect our commitment to making it easier and faster for people to work with data. One of the most important themes of Tableau 10 is to further self-service analytics for all kinds of users. Hence, this updated version has a fresh new design that makes it easy for users to grasp insights from their data. It also includes new analytical and mobile enhancements, options for preparing data and a host of new enterprise capabilities.

 

AIM: Tableau has gained widespread prominence for bringing data visualization into the mainstream. It has also been at the forefront of R&D. Can you tell us about R&D efforts in the field of statistical graphics or predictive analytics?

DG: We are more focused than ever on research and development; in fact, we just announced that nearly 900 employees at Tableau are focused on it. We remain focused on empowering everyone in the organisation, and on helping to provide a secure, scalable environment that is easy to manage and deploy.

For example, in 2017, the interface to data will start to feel even more natural, thanks in part to improvements in areas like natural language processing and generation. A new addition to the BI toolbox, natural language interfaces can make data, charts, and dashboards even more accessible by letting people interact with data using natural text and language. Though there is healthy skepticism surrounding this new field, it will be an exciting space to watch.

 

AIM: How is Tableau placed in the competitive landscape vis-à-vis rivals Spotfire and QlikView and legacy tech companies such as Microsoft and Amazon?

DG: One of the fun things about our industry is seeing analytics grow up in all kinds of places. We are very successful in deals where prospects evaluate multiple products. Our win rates are strong against all competitors.

We are incredibly well-positioned to serve our customers because:

  • Our enterprise capabilities have grown, and more and more customers are deploying the Tableau platform at scale.
  • Tableau’s ease of use is transformative. Our product quality and user experience are unmatched. We care about every detail. We beta test with tens of thousands of customers. We employ scientists and experts to craft an analytical experience that is beautiful and powerful. This is important because it enables our customers to answer questions at the speed of thought, and to stay in the flow of the analytical thinking process.
  •  Tableau is the “Switzerland of data.” We have an open and flexible platform that lets people connect to 50+ native data sources and unlimited web data sources.
  • It empowers end-users to do analytics without programming, and also helps IT move out of the report-factory role into an enabler of analytics for the entire organization.
  • Tableau’s flexibility and choice mean it adapts to existing IT infrastructure and is easy to deploy and maintain, either on-premises or in the cloud.
  • Tableau has an amazing community of customers and data enthusiasts, who create data heroes in every organisation.

 

AIM: Does Tableau as a company place more emphasis on competitors or on customer retention?

DG: Enabling customers to do great things with their data is precisely Tableau’s mission. For almost 14 years, Tableau has been helping organizations of all sizes make sense of their data in order to make better decisions faster.

[quote]Every decision we make is aligned with our customers’ needs. Tableau already has a strong customer-oriented culture, with a large and devoted base of customers ranging from startups and non-profits to government institutions and global enterprises, and we are working to build additional mechanisms to ensure customers always come first in all of our minds.[/quote]

 

AIM: Last word, how is the market for BI and analytics in India shaping up?

DG: There has been an uptick in the adoption of big data and analytics in India, but many companies that decide to use these technologies don’t really have clarity on the kind of results they intend to get out of them. So there is also a need to create more awareness and educate enterprises about the power of big data and the competitive advantage that can be gained by harnessing analytics in the right way.

There is also strong demand for big data analytics professionals across verticals in the Indian market. Our India entry around three years ago was a response to market demand in the region and the growing business opportunity it offers. India is currently one of our most crucial markets in terms of future potential. Gartner has estimated that the business intelligence market in India will reach $213.8 million in 2017, an 18.6% increase over 2015 spending. By 2025, the big data analytics sector in India is expected to grow eightfold to $16 billion, as per a recent report by NASSCOM, so the demand for skilled professionals in this domain is only going to grow going forward.

