
What goes into a Bona-Fide Cradle to Grave Business Transformation Platform?

Last Updated on July 24, 2023 by Editorial Team

Author(s): Cassidy Hilton

Originally published on Towards AI.


Foreword

Over the years, I've developed a deep appreciation and passion for world-class SaaS technology. When leveraged effectively, with an emphasis on business impact, usability, and raw utility, a well-formulated platform is often the difference between mediocre and massive advancement for a business. Marketing and industry hype aside, can the technology drive comprehensive, lasting change? Will it play a tangible role in an organization's ever-evolving pursuit of digitization and modernization?

Yes, the overall strategy developed for any digital transformation project reigns paramount over any technology, which is simply one component of that strategy. But the comprehensiveness and extensibility of the technology, as a key component of the overall modernization strategy, still matter. Why? Incremental interoperability. Said another way, the degree to which a collective organization can operate in a uniform and consistent manner while using the technology will ultimately dictate its overall impact, particularly for the practitioners and affected day-to-day users who demand fluidity in their processes, utility in their tasks, and accuracy and consistency in the information they use day in and day out.

The bigger picture boils down to three core issues that make a comprehensive, fluid platform critical. First, time and resources are wasted on the sheer number of applications on the market today. Second, digital maturity is stunted by the perpetual "cycle" of building a data and digital environment comprehensive enough to support full-picture analysis. Lastly, innovation is lacking because disparate systems, disjointed workflows, and maintenance-laden operations keep daily initiatives low-impact.

Application Overload

The number of applications available for every micro-level task is at an all-time high, as shown in the infographic below published by chiefmartec.com. According to chiefmartec.com, the number of marketing technology platforms has grown by 13.6% since last year, to a total of roughly 8,000 platforms (applications). It stands to reason that any environment with more moving parts becomes more complex, disorganized, and costly. In terms of complexity, piecing together multiple systems or applications into a fluid solution workflow is exceedingly difficult, and the failure to do so results in restricted information, siloed processes, and limited information integrity. Further, deploying and maintaining multiple disparate applications is flat-out costly: X applications require Y integrations, maintenance schedules, upgrades, and dedicated resources to maintain.

[Infographic: Marketing Technology Landscape 2020, chiefmartec.com] [9]

Limited Digital Maturity

Disparate systems and data environments often produce negative ripple effects. Remember the "more moving parts, more complexity" comment above? It matters here because it affects employees' ability to perform their jobs well, day in and day out. When data disparity exists across an organization, multiple employees and departments repeatedly pull the same information and perform the same day-to-day preparatory tasks, which ultimately limits their ability to do more meaningful work collectively. Informatica says it well: "The key to minimizing repetitive work is finding a way to easily reuse your logic on the next data set, rather than starting from square one each time." [3]

"The key to minimizing repetitive work is finding a way to easily reuse your logic on the next data set, rather than starting from square one each time." — Informatica

I refer to the above as "Digital Economies of Scale"; I've written at length about this topic in a previous article. Essentially, it means leveraging past work to drive incremental quality and speed in future work. Further, the struggle is real when it comes to any business's ability to get its arms around all the information it holds across the organization. According to Informatica, 80 percent of any analytics initiative is spent simply collecting information in order to establish a complete picture for analysis [1]. The problem is that when information is distributed disparately across departments and subunits, the effort required to bring it all together is astronomical. Compounding the challenge, new and often more relevant information becomes available every minute of every day; the "Data Never Sleeps" series does a tremendous job of articulating this very thing. Establishing a fluid environment in which complete, timely analysis can be performed on a regular, ongoing basis is a perpetual battle for every organization, and disparate systems only compound the problem.

Lastly, organizations face the risk of making inaccurate decisions based on bad or limited data every day. The reality is that even the best-kept application environments still pose data integrity risk; that is simply the nature of structured, tabular data collected in various fashions. Bad decisions due to inadequate data cost businesses $15 million on average in 2017, according to Gartner's Data Quality Market Survey [2]. Inadequate data, as noted above, arises from disparate systems and processes across the organization, and it has a compounding effect in the form of perpetual effort spent maintaining a disjointed infrastructure.

Lack of Innovation

A direct result of the aforementioned "stuck in the cycle" effect of mundane, low-impact daily work is an organization's inability to focus on more meaningful, innovative pursuits. The perpetual endeavor of building a fully comprehensive data foundation atop disparate workflows and applications hinders an organization's ability to raise the quality of its day-to-day pursuits, period.

66 percent of IT budgets are allocated to current status quo tasks, not to digital-focused innovation. — 2018 IDG Economic Outlook Survey

According to the 2018 IDG Economic Outlook Survey, 66 percent of IT budgets are allocated to current status quo tasks rather than digital-focused innovation [4]. Rather than leveraging deep data analysis to develop increasingly optimized workflows and a more innovative approach to macro- and micro-level initiatives, system and application disparity promotes inefficient, disjointed processes and low-impact, maintenance-laden operations.

There are many niche players out there, some of them very effective and tremendously successful in specific capacities and circumstances. However, modern technologies that promote operational fluidity and enable innovative day-to-day work by way of "Digital Economies of Scale" are hard to come by, and they literally change the competitive landscape for the organizations that use them. To this end, the SaaS one-percenters that hold their own in the enterprise space as a critical component of the foundation and fabric of organizations' daily operations merit recognition. One in particular: Domo.

Platform Overview

You may or may not have heard of the platform; even if you have, you are likely unaware of how comprehensive the technology is. Domo isn't just a BI tool; it's a "cradle to grave" business transformation platform, built from the ground up to serve the digital transformation aspirations most modern companies hold today. The platform originated in the cloud to support enterprise-grade requirements and scale, and its fluidity and comprehensiveness allow for the rapid development of digital solutions across a multifaceted set of product capabilities.

Yes, Domo plays in competitive waters with the likes of Tableau, Qlik, and Looker, among many others, but it also goes toe to toe with the major ETL providers, assimilates fluidly into the major cloud providers' (AWS, GCP, Azure) workflows, and extends across IoT, custom applications, predictive analytics, and native AI and data profiling. Even further, the user experience from A to Z is completely fluid and promotes native, organic collaboration across the entire suite of offerings as a core component of the platform. Can you say enterprise-grade interoperability?

The platform, in its entirety, spans seven core competencies, which have certainly evolved over the company's lifecycle but in their truest form have stayed the same. What I'm getting at is that the great Josh James has held the vision for an end-to-end, enterprise-grade SaaS analytics platform for longer than most readers have even been in the industry, let alone understood the genius of that vision circa 2010.

When I talk about fluidity and comprehensiveness, I don't mean "be everything to everyone." I mean aligning the core functionality of the platform to a modern organization's typical workflow and approach to macro- and micro-level digital transformation and modernization. That is not easily accomplished, and Domo does it better than any other platform on the planet.

Connect

Anyone worth their salt in this space knows the outputs don't just happen magically. By outputs, I mean insights, visuals, predictions, applications, and so on. The outputs are where the glory is, but gaining access to any and all data sources, so that intelligent outputs can tell a full story and provide a comprehensive understanding of a given subject, is critical. This is where Domo is head and shoulders above the rest. They've hung their hat on the ability to easily bring in data, no matter where it resides, whether in the cloud, on-premises, or in files, via hundreds of full-scale, plug-and-play connectors in their impressive connector library. Because of this foresight about the requisite upstream requirements of a robust digital workflow, outputs that impact real change at scale are merely a natural outcome of the user development experience.

Domo supports virtually any integration strategy; it simply depends on what is best for the organization. At a high level, the connector library of 1,000+ pre-built API integrations allows for quick entry of credentials (in most cases) and replication of core systems data into Domo's cloud. Their federated integration capabilities allow visualizations to render directly against source systems like Athena, Redshift, Azure, and PostgreSQL for organizations that prefer to maintain full jurisdiction over their data and where it resides. They have even recently optimized support for cloud-native federated data warehouses, starting with Snowflake. Additionally, the Workbench application enables on-premises integrations through basic local configuration for secure data transit. With all of these means of developing comprehensive, sound integrations across a multitude of data sources, a data environment conducive to full-picture analysis and ongoing innovative work is easily achievable with Domo.
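To make this concrete, here is a minimal sketch of a programmatic integration using pydomo, Domo's Python SDK. The client credentials and dataset ID are placeholders, and the high-level helper names (ds_get, ds_create) are assumptions to verify against the SDK version you install.

```python
# Minimal sketch of a programmatic Domo integration via the pydomo SDK.
# CLIENT_ID/CLIENT_SECRET and the dataset ID are placeholders, and the
# high-level helpers (ds_get, ds_create) should be verified against the
# installed SDK version.
from pydomo import Domo

domo = Domo('CLIENT_ID', 'CLIENT_SECRET', api_host='api.domo.com')

# Pull a replicated dataset down as a pandas DataFrame
df = domo.ds_get('00000000-0000-0000-0000-000000000000')
print(df.head())

# Push a derived table back up as a new Domo dataset
new_id = domo.ds_create(df, 'example_derived_dataset')
```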

Store

In addition to the broad out-of-the-box integration capabilities, Adrenaline, Domo's intelligent data warehouse, is a critical component of the platform and truly its secret sauce. Adrenaline is the engine behind the enterprise-grade computational capabilities and the literal sub-second query performance of the interactive user experience, handling solutions powered by billions of records with ease. Those of you who've crashed a Jupyter notebook, or had a machine sound ready for takeoff while churning through monstrous local flat files, know how invaluable a fluid, performant development environment is when you're simply trying to access data in order to do your thing.

A key capability here is the bi-directional connect/store functionality of the platform, specifically the integrations. The most digitally mature organizations are the ones monitoring outcomes and decisions, maintaining data remediation and transformation steps, recording feedback, and measuring impact over time via recursive data loops. The bi-directional write-back technology facilitates those data loops wherever they're needed. The bottom line on Domo's data storage and accessibility capabilities is this: they'll go head to head with all the major integration providers and will prove superior more often than you'd ever imagine, period.

Prepare

Contrary to popular belief amid the hype, particularly among those with aspirations of leveraging AI as part of their digital transformation strategies (more on this later), data preparation is where analysts, engineers, and scientists alike spend the majority of their time. What's more, this step is often performed in a silo (cue the spreadsheets, Jupyter notebooks, views, and one-off tools) with little visibility to impacted users and near-impossible repeatability of this invaluable work. It has always baffled me that data prep, even here in the year 2020, remains a "start from scratch" endeavor time and time again.

According to Anaconda's annual survey, data scientists spend about 45% of their time on data preparation tasks, including loading and cleaning data; loading alone (the "L" in ETL) accounts for 19% of their time.

Imagine a world where the preparatory steps around a given dataset, or multiple datasets, were saved, annotated, and made available to the broader organization so that a real-world "economies of scale" effect could take hold. In Domo, it happens natively.

"Data scientists spend about 45% of their time on data preparation, including 19% spent loading data." — Anaconda [5]

Fusion is Domo's data preparation and transformation engine. It facilitates a consistent, multifaceted data environment in which the multitude of sources integrated into Domo can be analyzed, profiled, cleansed, enriched, melded, and prepared for use, all in one common workflow, without any outside providers, technologies, or services.

Domo's Magic ETL enables all users to cleanse and combine data from anywhere with no coding required. More advanced developers can leverage native MySQL, or even Redshift, if SQL is their language of choice, and there are native Python and R capabilities as well. It cannot be overstated: all of these modules coexist in one common, cloud-based environment where they are accessible to other users in a highly collaborative and transparent fashion.
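As a sketch of what a scripting step inside that common workflow can look like, here is a minimal Python tile, assuming the domomagic read/write helpers Domo exposes to Magic ETL script tiles; the tile input name 'orders' and the column names are hypothetical.

```python
# Sketch of a Python scripting tile inside a Magic ETL dataflow, assuming
# the domomagic read/write helpers available to script tiles. The tile
# input name 'orders' and the column names are hypothetical.
from domomagic import read_dataframe, write_dataframe

df = read_dataframe('orders')

# A reusable cleansing step: normalize join keys and fill missing amounts,
# defined once here and visible to every downstream consumer of the flow
df['customer_id'] = df['customer_id'].str.strip().str.upper()
df['amount'] = df['amount'].fillna(0.0)

write_dataframe(df)  # hand the prepared frame to the next tile
```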

Visualize

Domo's visualization engine is powered by its Explorer technology, a best-in-class analytics capability that goes beyond highly intuitive, gorgeous visualization development. In recent years, Domo has added guided storytelling and data exploration and profiling, all of it mobile-friendly, as the platform has been since its inception. Coupled with the highly performant computing of Adrenaline, the visualization experience, from both a developer and a consumer perspective, is seamless, practical, and frankly, in many ways, sublime.

Collaborate

Buzz is Domo's collaboration engine and has been a key component of the platform since day one. Buzz allows for collaboration on individual visualizations and across collective dashboards. What's more, Buzz allows annotations and tagging on specific data points in visualizations, which drives engagement and fosters a deep, fundamental understanding of data through the convergence of quantitative and qualitative insights.

Ever been deep in the weeds on a project, wondering why you're seeing a specific behavior or trend, or maybe just needing clarification on field names, and wished you could simply "@" the SME within the environment you're already working in? Yeah, you and millions of other developers. Not only does Domo support this scenario, it becomes easily developed muscle memory even for novice users.

Predict

The Predict stage of the digital workflow is powered by Domo's Mr. Roboto, the AI-driven intelligence layer of the platform. It iteratively analyzes incoming data to detect trends, deviations, anomalies, and correlations, and to optimize queries. Mr. Roboto not only generates AI-driven insights; it also drives optimized business workflows through full-fledged, macro-level comprehension of the entire business intelligence environment. Combining capabilities like anomaly and correlation detection with user consumption patterns and recurring focus points, Mr. Roboto intelligently surfaces insights and anomalies to optimize business functions and predict likely shortcomings in business performance.

Further, automated alerts based on changes in trends or behaviors in the data, together with the aforementioned AI-driven insights, can be configured on a number of pre-defined factors, allowing for a truly achievable "data-driven enterprise," as they say (minus the hype and overpromise).
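To illustrate the kind of trend-deviation detection behind such alerts, here is a generic rolling z-score sketch in Python; this shows the technique in miniature and is not Mr. Roboto's actual implementation.

```python
# Generic rolling z-score anomaly check: flag points that sit more than
# `threshold` rolling standard deviations away from the rolling mean.
# A sketch of the technique, not Domo's internal implementation.
import pandas as pd

def flag_anomalies(series: pd.Series, window: int = 30, threshold: float = 3.0) -> pd.Series:
    mean = series.rolling(window, min_periods=window).mean()
    std = series.rolling(window, min_periods=window).std()
    z = (series - mean) / std
    return z.abs() > threshold

# Toy example: flat daily revenue with one injected spike on day 45
revenue = pd.Series([100.0] * 60)
revenue.iloc[45] = 500.0
print(revenue[flag_anomalies(revenue)])  # only the day-45 spike is flagged
```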

The AI practitioners in the room may be scratching their heads at this point: the AI-driven analysis and profiling above, while true and impressive, may not be exactly what you were expecting from a section titled Predict, in all its glory. Have no fear, my fellow nerds (relax with the nerd description, you're in good company), Domo has native machine learning capabilities as well. The functionality is a natural extension of the product's data preparation workflow in the Magic ETL module. The machine learning capabilities include classification, clustering, forecasting, multivariate outliers, outlier detection, and regression. The ease of taking raw data from just about anywhere imaginable and, in a series of fluid steps, training a model, applying predictions to future or unknown records, and deploying these outputs in an enterprise-grade, productionized manner is a beautiful thing. It is imperative to note, however, that few organizations today are seeing success with AI. In 2019, organizations invested $28.5 billion in machine learning systems development, according to Statista [6]. Yet, according to IDC, only 35% of organizations report successfully deploying analytical models into production [7].
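As a generic illustration of that train-then-score loop, here is its shape in scikit-learn on synthetic data; this mirrors the workflow described above, not Domo's internal ML engine.

```python
# Generic sketch of the train/score loop: prepare data, train a classifier,
# evaluate on a holdout, then apply predictions to unseen records.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in for "raw data from just about anywhere"
X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = RandomForestClassifier(random_state=42).fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")

# "Applying predictions to future or unknown records"
predictions = model.predict(X_test)
```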

Only 35% of organizations report successfully deploying analytical models into production. — IDC

More often than not, ML endeavors are simply science projects: they're performed by small teams or individuals, often locally, with many degrees of separation between the data scientists and the software and technology teams, the latter being the ones needed to deploy models into production in most organizations. Disparate processes and platforms continue to hinder the aspirations of AI-driven businesses everywhere.

"No machine learning model is valuable unless it's deployed to production." — Luigi Patruno [8]

So what goes into doing this the right way? What do we know about the 35% of organizations that have successfully deployed analytical models into production? I'll be brief here, as I'm writing a separate article on this topic to be published later, but for machine learning models to provide maximum impact and value to an organization, the predictions need to be deployed into its native processes and workflows. The last thing boots-on-the-ground individuals need is to be asked to follow a cadence or process outside of their typical day-to-day.

For this to happen successfully, the degree to which developers can recapitulate a typical software engineering development and deployment cycle from end to end is key. Remember the bi-directional integration capabilities discussed in the Connect and Store sections? That functionality is often overlooked but is critical when it comes time to deploy models. Why? The ability to write predictions back to the source systems employees use to drive their day-to-day actions allows predictions to assimilate naturally into pre-existing workflows. What's more, from an engineering perspective, the fact that these bi-directional capabilities already exist in Domo and require minimal effort from the engineering teams is a huge deal; just ask them.
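Sketching that write-back step with the hypothetical pydomo client from the Connect section (the dataset ID and column names are placeholders, and the ds_update helper is an assumption to verify against the SDK):

```python
# Sketch of the write-back step: push model scores into a Domo dataset so
# connected source systems and workflows can consume them. Reuses the
# hypothetical pydomo client from the Connect section; the dataset ID,
# column names, and the ds_update helper are assumptions to verify.
import pandas as pd

scored = pd.DataFrame({
    'customer_id': ['C001', 'C002'],
    'churn_probability': [0.12, 0.87],
})

# Overwrite the predictions dataset with the latest scores
domo.ds_update('11111111-1111-1111-1111-111111111111', scored)
```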

As mentioned above, the forthcoming data science deep-dive article will outline the different types of machine learning approaches Domo supports and the technology behind them. The important takeaway from this article is that Domo has full AI and machine learning capabilities built into the platform as a natural component of a typical analytics workflow. Whether it's AI-driven data profiling and ongoing analysis or ML added onto an existing analytics solution, Domo has it covered in a very fluid manner.

Extend

Domo's app store is the typical software or machine learning engineer's playground. The developer portal is where many organizations and partners spend much of their time developing even tighter, more synergistic solution usability, integrations, and interrelationships across their respective businesses. At a high level, the app store consists of APIs, partner- and customer-developed apps, dashboards, connectors, algorithms, and premium solutions. Many impressive applications have been developed via Domo's SDK and Design Studio.

Source: https://www.domo.com/platform/build

If, on the off chance, Domo lacks the native functionality needed, whether it's an API integration to automate a data pipeline, rendering data into a mobile application via the SDK, or a fully customized dashboard solution, the developer portal serves as a comprehensive extension and the ultimate completion of a fully exhaustive cradle-to-grave platform.
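For a flavor of that API surface, here is a minimal sketch of authenticating against Domo's public REST API and listing datasets. The OAuth client-credentials flow and endpoints reflect my reading of Domo's published developer docs, so treat the details as assumptions to verify; credentials are placeholders.

```python
# Minimal sketch of Domo's public REST API: obtain an OAuth access token via
# the client-credentials grant, then list datasets. CLIENT_ID/CLIENT_SECRET
# are placeholders; verify endpoints against Domo's current developer docs.
import requests

token = requests.get(
    'https://api.domo.com/oauth/token',
    params={'grant_type': 'client_credentials', 'scope': 'data'},
    auth=('CLIENT_ID', 'CLIENT_SECRET'),
).json()['access_token']

datasets = requests.get(
    'https://api.domo.com/v1/datasets',
    headers={'Authorization': f'Bearer {token}'},
).json()

for ds in datasets:
    print(ds['id'], ds['name'])
```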

Bottom Line

The number of SaaS analytics providers is at an all-time high as we sit here in the year 2020, as is the number of providers servicing specific components of the typical analytical workflow (e.g., API connections, ETL, visualization, AI). The ramifications of so many technologies often include disparate processes, workflows, and systems, and squandered or misdirected day-to-day work. Melding the proper stack of technologies and business strategies together to achieve optimized day-to-day processes, comprehensive data pipelines, and interconnected organizational communication practices is nearly impossible.

However, platforms whose core functionality directly aligns with a typical digital (analytics) workflow from end to end are the ones driving incremental, monetizable impact across organizations. The reason: workflows remain largely undisturbed, and in many cases are naturally redirected in a more optimized manner by the platform's core functionality. A great example of squandered effort is data preparation. How often have you seen the same preparatory steps performed on separate projects over and over again? Or reporting? How often have you seen an executive ask for his or her own one-off report, because the existing dashboard is outdated or a follow-on question arose, that ends up being a near-full recreation of what you've already built? Platforms that allow for "picking up where we left off" in a straightforward way on the next, inevitable request are the platforms of the future.

Further, the providers that organically bring employees together in a collaborative and transparent fashion are the ones that see adoption and usage in droves. How often have you seen the engineering and data science teams come together to develop a deployment strategy for a predictive model that encourages model transparency and assimilation into pre-existing workflows? It doesn't happen very often, and hopefully the reasons are somewhat obvious at this point in the read. A platform that cuts out the minutiae and tactical requirements, and simply requires engagement and awareness, is a refreshing reality for organizations striving for better-aligned teams, more optimized workflows, and higher-impact day-to-day work.

The degree to which a SaaS platform's core functionality embodies sound, effective digital practices and workflows directly impacts the daily macro- and micro-level efforts of every organization, particularly those striving to modernize, transform, and innovate. A full-stack, comprehensive, fluidly interconnected platform like Domo not only emulates sound practices from end to end; the company has had a pulse on this transformative ideal for many years. To that end, Domo has proven over its lifetime to provide the requisite foresight all organizations hope to find in their providers, and personally, I look forward to the future evolution of the Domo platform.

Wallpaper Downloads

https://bit.ly/3mqAse6 | https://bit.ly/35NzguD
https://bit.ly/37PNbmn | https://bit.ly/3mqA4fE

Sources:

1- https://blogs.informatica.com/2020/07/16/3-rewarding-ways-finance-can-use-data-to-drive-success/

2- https://www.gartner.com/smarterwithgartner/how-to-stop-data-quality-undermining-your-business/

3- https://www.aunalytics.com/4-ways-disparate-data-sets-are-holding-you-back/

4- https://www.idg.com/tools-for-marketers/2018-cio-tech-poll-economic-outlook/

5- https://www.datanami.com/2020/07/06/data-prep-still-dominates-data-scientists-time-survey-finds/

6- https://www.statista.com/chart/17966/worldwide-artificial-intelligence-funding/

7- https://www.mesaonline.org/2019/09/27/idc-ai-can-significantly-help-organizations-with-analytics-business-intelligence/

8- https://mlinproduction.com/deploying-machine-learning-models/

9- https://chiefmartec.com/2020/04/marketing-technology-landscape-2020-martech-5000/


Published via Towards AI
