This is a collaborative post between AtScale and Databricks. We thank Kieran O’Driscoll, Technology Alliances Manager, AtScale, for his contributions.
Kyle Hale, Solution Architect at Databricks, coined the term “Semantic Lakehouse” in his blog a few months back. It is a good overview of the potential to simplify the BI stack and leverage the power of the lakehouse. As AtScale and Databricks collaborate more and more on supporting our joint customers, the potential for leveraging AtScale’s semantic layer platform with Databricks to rapidly create a Semantic Lakehouse has taken shape. A semantic lakehouse provides an abstraction layer over the physical tables and a business-friendly view of data consumption: it defines and organizes the data into distinct subject areas and defines the entities, attributes, and joins. All of this simplifies data consumption for business analysts and end users.
Most enterprises still struggle with data democratization
Making data available to decision-makers is a challenge that most organizations face today. The larger the organization, the harder it becomes to impose a single standard for consuming and preparing analytics. Over half of enterprises report using three or more BI tools, and over a third use four or more. On top of BI users, data scientists have their own range of preferences, as do application developers.
These tools work in different ways and speak different query languages. Conflicting analytics outputs are almost guaranteed when multiple business units make decisions from different siloed data copies or conventional OLAP cubing solutions such as Tableau Hyper Extracts, Power BI Premium Imports, or Microsoft SQL Server Analysis Services (SSAS) for Excel users.
Keeping data in separate data marts and data warehouses, extracts in various databases, and externally cached data in reporting tools does not provide a single version of truth for the enterprise, and it increases data movement, ETL, security overhead, and complexity. It becomes a data governance nightmare, and it also means organizations are running their businesses on potentially stale data scattered across silos in the BI layer rather than leveraging the full power of the Databricks Lakehouse.
The need for a universal semantic layer
The AtScale semantic layer sits between all of your analytics consumption tools and your Databricks Lakehouse. By abstracting the physical form and location of the data, the semantic layer makes data stored in the Delta Lake analysis-ready and easily consumable by each business user’s tool of choice. Consumption tools can connect to AtScale via one of the following protocols:
- For SQL, the AtScale engine appears as a Hive SQL warehouse.
- For MDX or DAX, AtScale appears as a SQL Server Analysis Services (SSAS) cube.
- For REST or Python applications, AtScale appears as a web service.
Rather than processing data locally, AtScale pushes inbound queries down to Databricks as optimized SQL. This means that users’ queries run directly against Delta Lake, using Databricks SQL for compute, scale, and performance.
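Because AtScale presents itself as a Hive-compatible SQL endpoint, any standard Hive or DB-API client can query the business model directly. The sketch below is illustrative only: the model name, column names, host, and port are assumptions, not AtScale-documented values; the point is that a BI-style query is just plain SQL against business-friendly names, which AtScale then rewrites and pushes down to Databricks.

```python
# Minimal sketch: querying an AtScale semantic model over its SQL interface.
# All model/column/host names here are hypothetical examples.

def semantic_query(model, measures, dimensions):
    """Build a BI-style SQL query against a published semantic model."""
    cols = list(dimensions) + [f"SUM({m}) AS {m}" for m in measures]
    return (
        f"SELECT {', '.join(cols)} "
        f"FROM {model} "
        f"GROUP BY {', '.join(dimensions)}"
    )

sql = semantic_query("sales_model", ["order_amount"], ["region", "order_year"])
print(sql)

# To execute, use any Hive-compatible client, e.g. PyHive (endpoint assumed):
# from pyhive import hive
# conn = hive.connect(host="atscale.example.com", port=10000)
# cursor = conn.cursor()
# cursor.execute(sql)
```

The user writes against logical entities (`sales_model`, `region`); AtScale resolves those to physical Delta tables and generates the optimized SQL that actually runs on Databricks.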

The added benefit of using a Universal Semantic Layer is that AtScale’s autonomous performance optimization technology identifies user query patterns and automatically orchestrates the creation and maintenance of aggregates, just as a data engineering team would. No one has to spend development time and effort creating and maintaining these aggregates; AtScale auto-creates and manages them for optimal performance. The aggregates are created in the Delta Lake as physical Delta tables and can be thought of as a “Diamond Layer”. They are fully managed by AtScale and improve the scale and performance of your BI reports on the Databricks Lakehouse while radically simplifying analytics data pipelines and the associated data engineering.
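To make the “Diamond Layer” concrete, the snippet below sketches what such an AtScale-managed aggregate conceptually resembles: a pre-summarized Delta table derived from observed query patterns. The schema, table, and column names are assumptions for illustration; in practice AtScale generates and maintains these tables itself, and users never write this DDL by hand.

```python
# Hedged illustration only: a pre-aggregated Delta table of the kind
# AtScale's optimizer would create automatically. Names are hypothetical.

aggregate_ddl = """
CREATE TABLE IF NOT EXISTS diamond.agg_sales_by_region_year
USING DELTA AS
SELECT region,
       order_year,
       SUM(order_amount) AS total_order_amount,
       COUNT(*)          AS order_count
FROM   gold.sales
GROUP  BY region, order_year
""".strip()

# On Databricks this statement would run as: spark.sql(aggregate_ddl)
print(aggregate_ddl)
```

When a dashboard later asks for sales by region and year, AtScale can rewrite the query to hit this small summary table instead of scanning the full fact table.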
Creating a tool-agnostic semantic lakehouse
The vision of the Databricks Lakehouse Platform is a single unified platform to support all of your data, analytics, and AI workloads. Kyle’s description of the “Semantic Lakehouse” is a nice model for a simplified BI stack.
AtScale extends this idea of a Semantic Lakehouse by supporting both BI workloads and AI/ML use cases through our tool-agnostic Semantic Layer. The combination of AtScale and Databricks means the Semantic Lakehouse architecture extends to any presentation layer, whether it is Tableau, Power BI, Excel, or Looker. They can all use the same semantic layer in AtScale.

With the arrival of the lakehouse, organizations no longer have their BI and AI/ML teams working in isolation. AtScale’s Universal Semantic Layer helps organizations get consistent access to all of their business data, whether the consumer is a business user in Excel or a data scientist in a notebook, while leveraging the full power of their Databricks Lakehouse Platform.
Additional resources
Watch our panel discussion with Franco Patano, lead product specialist at Databricks, to learn more about how these tools can help you create an agile, scalable analytics platform.
If you have any questions about AtScale, or about modernizing and migrating your legacy EDW, BI, and reporting stack to Databricks and AtScale, feel free to reach out to [email protected] or contact Databricks.