Dave Mariani, CEO

Accessing and storing relevant data has been a major task for companies since the dawn of technology. In this scenario, integrating analytics with Business Intelligence (BI) systems is an important step toward gaining full return on investment. But big data comprises large volumes of unformatted, unstructured data, and transferring that data becomes a nightmare for many CIOs. The Hadoop platform plays a significant role in processing and storing data quickly and in transferring large files from node to node. "We started AtScale not just to make Hadoop work for BI, but to put an end to the business compromises we have come to accept," says Dave Mariani, CEO of AtScale. The San Mateo-based company allows users to access all their data and make decisions that can improve their products and services.
AtScale offers services in the Hadoop and Business Intelligence arena, supporting HDFS (Hadoop Distributed File System) as well as the MapReduce, Impala, Spark SQL, and Hive-on-Tez SQL-on-Hadoop engines. On top of that, AtScale turns a Hadoop cluster into a scale-out OLAP (Online Analytical Processing) server, letting clients use their preferred BI tools, such as Tableau or QlikView, to analyze huge volumes of data on Hadoop. OLAP on Hadoop delivers SQL (Structured Query Language)-based tooling to tap into the vast amount of data stored in Hadoop. Working alongside ecosystem technologies such as ZooKeeper, Solr, Hive, and Nagios, and data warehouses like IBM Netezza, AtScale helps BI professionals work across multiple databases and makes it easier for organizations to integrate Hadoop into an existing BI setup.
Moreover, AtScale's technology creates a semantic layer between Hadoop and third-party BI tools for multidimensional analysis. It brings big data, BI, and analytics to the masses through a SQL- and MDX-compliant interface that lets business users leverage their existing BI tools while providing direct, secure, and interactive access to data on Hadoop. The AtScale Intelligence Platform is designed to enable interactive data analysis on Hadoop from within standard BI tools such as Microsoft Excel, Tableau Software, or QlikView.
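To make the idea of a semantic layer concrete, here is a minimal sketch of how such a layer can map business-facing names onto physical columns and emit the SQL that actually runs on a Hadoop engine such as Hive or Impala. All class and method names here are hypothetical illustrations, not AtScale's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str        # business-facing name, e.g. "Total Revenue"
    expression: str  # physical SQL expression, e.g. "SUM(amount)"

@dataclass
class Dimension:
    name: str    # business-facing name, e.g. "Region"
    column: str  # physical column, e.g. "region"

@dataclass
class SemanticLayer:
    """Maps business terms to physical columns and emits engine SQL."""
    table: str
    measures: dict = field(default_factory=dict)
    dimensions: dict = field(default_factory=dict)

    def add_measure(self, m: Measure):
        self.measures[m.name] = m

    def add_dimension(self, d: Dimension):
        self.dimensions[d.name] = d

    def to_sql(self, measure: str, group_by: str) -> str:
        """Translate a business-level request into SQL for the engine."""
        m = self.measures[measure]
        d = self.dimensions[group_by]
        return (f"SELECT {d.column} AS `{d.name}`, "
                f"{m.expression} AS `{m.name}` "
                f"FROM {self.table} GROUP BY {d.column}")

# A BI tool asks for "Total Revenue by Region"; the layer resolves the
# physical table, column, and aggregate behind those names.
layer = SemanticLayer("sales_fact")
layer.add_measure(Measure("Total Revenue", "SUM(amount)"))
layer.add_dimension(Dimension("Region", "region"))
print(layer.to_sql("Total Revenue", "Region"))
```

The point of the indirection is the one the article describes: business users keep their familiar vocabulary and tools, while the layer governs which physical expressions actually reach Hadoop.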
With AtScale, users can play with data ‘where it lays’ and analyze it at full speed and with more control
AtScale's dynamic cubes integrate well with customers' existing BI tools while also providing a layer of governance to ensure standardization of business logic across data consumers. "AtScale is built to support concurrent connections from multiple BI clients, and as a result the underlying SQL-on-Hadoop engine must perform reliably under a concurrent workload of 10s or 100s of users," states Mariani.
The ‘all-in-one’ AtScale platform supports BI on Hadoop without the need for any data movement, custom drivers, or separate clusters. Besides, AtScale supports diverse Hadoop ecosystem technologies such as Apache Spark, Kylin, Pig, Oozie, and Flume. “With AtScale, users can play with data ‘where it lays’ and analyze it at full speed and with more control,” affirms Mariani. AtScale works with all major Hadoop distributions (Cloudera, MapR, and Hortonworks), leveraging MDX (Multidimensional Expressions) or SQL over ODBC (Open Database Connectivity), JDBC (Java Database Connectivity), or OLE DB. With Hadoop, it offers a scalable architecture that can grow as quickly as the organization needs to grow, without increasing overhead costs commensurately.
Recently, AtScale partnered with Tableau Software, the BI solution provider, to deliver "BI that works on Hadoop," helping business users run analytics on Hadoop up to a hundred times faster. Moving ahead, AtScale aims to provide business users with unique software and an enterprise-grade experience on Hadoop. "Our mission is to bridge the gap between the existing BI ecosystem and Hadoop," says Mariani.