Tableau Conference 2018 Keynote: Natural Language generation for the next generation of BI

Robert Palmer

    Tableau kicked off their 2018 conference yesterday with the usual levels of enthusiasm and fanfare that we’ve seen over the years.

    This year is different, however. With Microsoft taking steps to improve its offering and ThoughtSpot quickly rising through the ranks with its very different approach to business intelligence, it was time for Tableau to show that all those acquisitions of the past year had made a difference. The question is: will Tableau re-establish itself as the market leader in BI and make sure it doesn’t get left behind?

    Opening with the tagline “we love data”, Tableau acknowledged that not everyone understands the value that BI can bring, or even how it is best deployed. They also acknowledged that data can be an emotive topic.

    This built up to the point that the next big evolution in visual analytics was something so significant that it was likened to the invention of the refrigerator or the birth of the internet. Following on from last year’s underwhelming announcements around Hyper and Tableau Prep (let’s face it, only we developers got excited about those), this one couldn’t afford to disappoint.

    So, did Tableau hit the mark or did they present another damp squib?

    Tableau Prep

    First up, Tableau Prep. For those that haven’t seen it, this tool was first announced as part of last year’s keynote. It was created to make it easier for people from non-technical backgrounds to prepare data, with the simple drag-and-drop interface that we’ve all come to know and love from Tableau. If you’ve ever heard of Alteryx, think of it as something very similar, just with a lot less functionality and no ability to schedule data refreshes.

    Here at Edit we’ve been using this technology for ad-hoc analysis since its release, and I can safely say it’s incredibly impressive. Tasks that SQL would struggle with, taking several hours or even days, run in just a few hours or even minutes. So, what does the future hold for this tool?

    A few small UI enhancements have been announced that make data prep and identifying the source of columns much easier. To do this, Tableau have extended the data roles function and added custom data roles to the options, to automatically fix common issues, such as typos in URLs, that the existing roles can detect. The interface will also suggest ways to fix any issues it finds with the data. All very clever developments that will save the many hours lost when issues go unspotted in millions of records right at the start of an analysis.

    The second announcement was the ability to call R and Python scripts from within a flow. This has the potential to take the fight squarely to Alteryx’s door. By allowing R functions to be called from within Tableau Prep, there is further potential for BI to be integrated with the work undertaken by data science teams, which in turn will allow teams to clearly present businesses with the full benefit these practices can offer. Tableau didn’t share any constraints around this integration, so it remains to be seen whether it will deliver on its full potential.
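    Tableau didn’t show the mechanics on stage, but based on how Tableau’s existing Python integrations typically work (a function that receives a pandas DataFrame and returns one), a Prep script step might look something like this. The function name and columns are purely hypothetical:

    ```python
    import pandas as pd

    def clean_campaign_data(df: pd.DataFrame) -> pd.DataFrame:
        """Hypothetical cleaning step for a Tableau Prep flow.

        Assumes Prep passes the step's input as a pandas DataFrame
        and expects a DataFrame back; column names are illustrative.
        """
        out = df.copy()
        # Standardise a free-text channel column
        out["channel"] = out["channel"].str.strip().str.lower()
        # Flag spend values more than 3 standard deviations from the mean
        mean, std = out["spend"].mean(), out["spend"].std()
        out["spend_outlier"] = (out["spend"] - mean).abs() > 3 * std
        return out
    ```

    The appeal is that logic like this, which would otherwise live in a data science team’s notebooks, could run inside the same flow the BI team already maintains.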

    The last big announcement was Prep Conductor. This allows users to deploy Tableau Prep flows to a Tableau server and schedule them so that data can be refreshed on a regular basis. This is the functionality that the Tableau community has been crying out for on the forums, and one that everyone agrees should have been in place when Prep launched. It turns Tableau Prep from a tool for one-off data analysis into something that can be used in a production system. It will be included in the 2019.1 release early in the new year. Unfortunately, Tableau have decided that this will require an additional license above and beyond the existing Tableau Server and Desktop licenses that users hold.

    Personally, I think this could be a dangerous move. If it’s not priced right, I can see a massive backlash from the community. Going by the forums, several Tableau customers aren’t using Tableau Prep simply because they have no way of scheduling these extracts. By charging for this feature, Tableau risk alienating part of their key user base. If the price is low enough to justify the cost to a business, this may not be an issue, but it will be interesting to see the response when it launches.

    Data Models

    Next up, data models. Most of the functionality announced in this section reduces the effort needed to get a data model set up in Tableau. It covered things such as using system tables in an RDBMS to understand the relationships that exist, and identifying the join criteria between tables based on the query that Tableau is generating. All in all, very useful, but only really a benefit for Tableau developers.

    The big feature release was automatic level of detail (LOD) calculations. In the current version of Tableau, users must remember that values can be repeated when linking two tables together. For example, if we link item and basket tables to do some analysis, we must remember that the basket total will be repeated for every item in that basket. Failure to do so could mean we end up with an incorrect grand total for sales in a given month, for example. Today we can get around this with an LOD calculation that works out the maximum total per basket and uses that instead. A user who isn’t familiar with the data model may not know to keep an eye out for things like this.

    In future versions of Tableau, this isn’t going to be a worry. Tableau will know the context of the question and automatically add the LOD calculation in the background, making it less likely that human error creeps into analysis.
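    To make the duplication problem concrete, here’s a small sketch in pandas with invented example data. Joining items to baskets repeats the basket total once per item, so summing naively inflates the grand total; taking one value per basket first (the equivalent of the LOD workaround, roughly `{ FIXED [Basket ID] : MAX([Basket Total]) }` in Tableau) gives the right answer:

    ```python
    import pandas as pd

    # Invented data: two baskets, three items
    baskets = pd.DataFrame({"basket_id": [1, 2], "basket_total": [30.0, 20.0]})
    items = pd.DataFrame({"basket_id": [1, 1, 2], "item": ["tea", "milk", "bread"]})

    # The join repeats basket 1's total for each of its two items
    joined = items.merge(baskets, on="basket_id")

    naive = joined["basket_total"].sum()  # 30 + 30 + 20 = 80, inflated
    correct = (
        joined.groupby("basket_id")["basket_total"]
        .max()   # one total per basket
        .sum()   # 30 + 20 = 50, the true grand total
    )
    ```

    The announcement means Tableau would inject the per-basket step itself, so a casual user never sees the inflated figure.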

    Natural Language Querying (NLQ)

    Tableau saved the best for last: the feature that the whole keynote was built around.

    Ask Data.

    This feature means that users can type a question and Tableau will automatically interpret what the user is trying to ask. It will then present them with the answer using what it judges to be the best presentation method.

    It will be interesting to see how this compares with Microsoft’s and ThoughtSpot’s offerings of a similar kind. In my opinion, this could work much better than Microsoft’s current offering, and because it is integrated into Tableau it avoids ThoughtSpot’s requirement to purchase dedicated hardware. To top it off, it doesn’t need any additional licenses above the Tableau licenses that you already have.

    Tableau have touted that this doesn’t require any setup effort. However, I suspect that you will need to do some work to get the best out of this. It’s rarely the case that database fields are named in a way that makes it obvious for users to understand the data. We often find that we need to add in those synonyms to make sure that end users are able to get the right answer when they ask an ambiguous question.

    This function entered beta yesterday and based on Tableau’s typical release cycles we could potentially see this drop in a final release around March/April next year.

    For marketers, this has the potential to remove the need to learn a new piece of software, saving the training costs that are normally associated with it. Instead, they can simply ask questions such as “What’s happened to my CTO rate over the last 12 months?” and Tableau will provide the answers, allowing practitioners to answer the questions posed by their marketing directors far faster than they can today.

    What wasn’t announced on stage

    So that covers all the key announcements that came out of yesterday’s presentation, but Tableau snuck a few more surprises out in their coming soon page (https://www.tableau.com/products/coming-soon).

    Those of note include:

    • A redesigned mobile app that will be available for both iOS and Android. This one will provide interactive offline previews, allowing users to get even more out of Tableau on that crowded tube journey or in a mobile blackspot.
    • Nested sorting improvements: Currently, nested sorting in Tableau doesn’t quite work as you’d expect, and getting it to behave requires a lot of manual work from the user – soon this will be a thing of the past.
    • Export to PowerPoint: Historically, sending story points and dashboards out to execs hasn’t been easy. Tableau will soon have an export to PowerPoint, which means your exec dashboard can be built directly in Tableau. Commentary can then be added via a third-party natural language generation tool, enhanced by your marketing teams, and distributed to all the execs who don’t have Tableau access – or kept for posterity.
    • Google AdWords connector: For those that use Google AdWords, this update finally delivers a native way to analyse your metrics from within Tableau. No longer will you have to go out to different software or find a workaround – instead, all campaign reporting can be done in one place.
    • Web authoring improvements: Tableau has taken another big step to bring server and desktop together. The big feature is that you can now cancel queries that are running longer than you’d expect, meaning that you are no longer stuck waiting for something to finish.

    All in all, have Tableau delivered another damp squib? I guess it depends on your use of the software. For those that have been crying out for Prep conductor, the fact that it’s an additional license cost may well impact user adoption of this feature.

    For those that build dashboards and establish data models, the introduction of many time-saving features is going to make the bugs that creep in less likely, although I do worry that this could lead to a reduction in data understanding. For all those Tableau users out there who don’t understand all the nuances and intricacies of their Tableau data sources, automatic LOD calculations have just saved them a huge amount of work.

    One thing is for sure: by offering Ask Data free of charge to existing users and without dedicated hardware, Tableau has just changed the rules. No longer do we need dedicated hardware to provide this function, or to train individuals up so they can serve content to the rest of the business. We’re now in a world where anyone with a Tableau license can answer a business question simply by typing it into a browser. This is when data democratization has the potential to become a reality.
