Pentaho Adaptive Big Data Layer

Pentaho furthers innovation in Big Data integration and launches Pentaho Labs

Delivering the future of analytics, Pentaho Corporation today introduced a new adaptive big data layer in its platform that accelerates access to, and integration with, the latest versions and capabilities of popular big data stores. It also announced a “think tank” called Pentaho Labs for innovating breakthrough big data-driven technologies in areas such as predictive and real-time analytics.

The Pentaho adaptive big data layer supports Hadoop distributions from Cloudera, Hortonworks, MapR and Intel, as well as popular NoSQL databases Cassandra and MongoDB, and introduces support for Splunk. With Pentaho, data can be accessed once then processed, combined and consumed anywhere. These new Pentaho big data innovations bring greater flexibility, insulation from change and increased competitive advantage to companies facing the relentless evolution of the big data ecosystem.

According to Richard Daley, founder and chief strategy officer at Pentaho, “The relatively breakneck speed at which big data analytics technology evolves as compared to the relational world is paralyzing many companies. The innovations we’re announcing today overcome this paralysis and allow companies to keep their big data technology options open, reduce risk and save considerable development time while taking advantage of the latest innovations in popular Big Data stores.”


Pentaho Adaptive Big Data Layer

Pentaho Business Analytics can plug directly into leading-edge big data technologies with an advanced adaptive big data layer that supports the latest versions of Hadoop distributions, NoSQL databases and specialized big data stores. New capabilities include:

• Hadoop distributions: Pentaho’s new adaptive big data layer supports the following Hadoop distributions: Cloudera CDH 4.1.2, 4.1.3, 4.2.0 and 4.2.1; Intel’s IDH 2.3; Hortonworks’ HDP 1.2.x; and MapR 2.0.x and 2.1.x.

• NoSQL databases: Pentaho also delivers support for the latest features in MongoDB and Cassandra.

• Splunk: Machine data is one of the fastest growing and most pervasive segments of big data. Pentaho’s new Splunk adapter allows reading data from and writing data to Splunk.

New Pentaho Labs

Pentaho Labs, led by Richard Daley, is staffed with top industry experts and a renowned data scientist to incubate breakthrough advanced analytic capabilities driven by big data. Pentaho Labs encourages the seeding of new approaches and technologies that can, over time, be merged into the Pentaho roadmap based on market demand.

According to Krishna Roy, BI and Analytics Analyst at 451 Research, “We are talking to many companies who are very interested in big data analytics, but lack the knowledge, skills and resources to keep up with the rapid pace of change in the ecosystem or even get started. Pentaho recognizes this and today’s announcements help lower the barriers to entry and accelerate innovation.”
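Pentaho’s adaptive layer itself is proprietary, but the general idea it describes — writing job logic once against a small interface and swapping per-distribution adapters behind it — is a standard pattern. The following is a minimal, hypothetical sketch; the class and function names (HadoopAdapter, run_etl, etc.) are illustrative and are not Pentaho APIs.

```python
# Hypothetical sketch of an "adaptive layer": job logic is written once
# against a small interface, and per-distribution adapters are swapped in.
from abc import ABC, abstractmethod

class HadoopAdapter(ABC):
    """Minimal interface a job depends on, independent of distribution."""
    @abstractmethod
    def submit_job(self, jar: str) -> str: ...

class CDH4Adapter(HadoopAdapter):
    def submit_job(self, jar: str) -> str:
        return f"cdh4: submitted {jar}"

class MapR2Adapter(HadoopAdapter):
    def submit_job(self, jar: str) -> str:
        return f"mapr2: submitted {jar}"

# Registry mapping a distribution label to its adapter.
ADAPTERS = {"cdh-4.x": CDH4Adapter, "mapr-2.x": MapR2Adapter}

def run_etl(distribution: str, jar: str) -> str:
    # Calling code never changes when the cluster is upgraded or swapped;
    # only the registry key does.
    adapter = ADAPTERS[distribution]()
    return adapter.submit_job(jar)

print(run_etl("cdh-4.x", "wordcount.jar"))   # cdh4: submitted wordcount.jar
```

This is why an abstraction layer “insulates” users from ecosystem churn: supporting a new distribution means adding one adapter class, not touching every job.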

Cybersecurity Industry Leaders Launch the Cyber Threat Intelligence Capability Maturity Model

PRESS RELEASE

WILMINGTON, Del.–(BUSINESS WIRE) — Today, Intel 471, the premier provider of cyber intelligence-driven solutions worldwide, sponsored a partnership of 28 industry leaders serving public and private organizations across the vendor and consumer community. Together, these professionals volunteered their time, effort, and experience to launch the first version of the Cyber Threat Intelligence Capability Maturity Model (CTI-CMM), designed as a first-of-its-kind, vendor-agnostic and universally applicable resource to support organizations of all shapes and sizes across the CTI industry. In today’s evolving threat landscape, the sign of a successful Cyber Threat Intelligence (CTI) program is a mature program that seamlessly integrates with an organization’s core objectives and key outcomes.

“Unlocking the full potential of your CTI program requires alignment with the capabilities of each stakeholder it supports, and a tangible measurement of success synchronized with organizational priorities,” said Michael DeBolt, Chief Intelligence Officer at Intel 471. “The CTI Capability Maturity Model (CTI-CMM) is designed to support CTI teams in building their capabilities by aligning to defined practices for stakeholder business domains unique to each organization. The Model establishes shared values and principles across the industry to empower organizations to take a holistic approach to cyber threat intelligence with stakeholders in mind.”

“Advising numerous clients globally, I have observed a consistent need for an outcome-focused model for cyber intelligence programs. The CTI-CMM bridges the gap to help CTI programs create impactful and demonstrable value for their organization,” said Colin Connor, CTI Services Manager at IBM X-Force.

The all-volunteer team behind the CTI-CMM is comprised of professionals representing a wide range of sectors, geographic regions, backgrounds and experiences, including leaders from Intel 471, IBM, Kroger, Venation, Mandiant, IntL8, Regfast, Trellix, Autodesk, Centre for Cybersecurity Belgium (CCB), Northwave Cyber Security, Workday, Marsh McLennan, Signify, Tidal Cyber, DeepSeas, BP, Gojek, SAND and many more. These individuals created CTI-CMM to elevate cyber threat intelligence across the industry through knowledge and experiences. Together, they defined the following values and principles to support the CTI community moving forward:

Shared Values

Intelligence provides value through collaboration with our stakeholders and supporting their decision-making process.

Intelligence is never complete, and improvement is continuous. The same applies to adoption: constant improvement is crucial for success and for distinguishing the model from others that failed to keep pace with the times.

Intelligence is not proprietary, nor is it prescriptive. Therefore, the model should never be claimed by a single commercial party.

Shared Principles

Contextualizing threat intelligence within risk

Continuous self-assessment and improvement

Actionable intelligence based on stakeholder needs

Quantitative and qualitative measurement of intelligence

Collaborative and iterative intelligence processes

The team chose to design the CTI-CMM to align with industry best practices and with the concepts and format of a recognized cybersecurity maturity model, the Cybersecurity Capability Maturity Model (C2M2). Like the C2M2, the CTI-CMM is organized into ten domains. Each domain includes a “Domain Purpose” followed by a “CTI Mission” description of how the CTI function supports it, along with the associated CTI Use Cases and CTI Data Sources.

The CTI-CMM is the blueprint for a successful and effective CTI program. It exists to support the people who make decisions and take action to protect organizations. For more information, please visit: https://cti-cmm.org/

About Intel 471

Intel 471 empowers enterprises, government agencies, and other organizations to win the cybersecurity war using the real-time insights about adversaries, their relationships, threat patterns, and imminent attacks relevant to their businesses. The company’s platform collects, interprets, structures, and validates human-led, automation-enhanced intelligence, which fuels our external attack surface and advanced behavioral threat hunting solutions. Customers utilize this operationalized intelligence to drive a proactive response to neutralize threats and mitigate risk. Organizations across the globe leverage Intel 471’s world-class intelligence, our trusted practitioner engagement and enablement and globally dispersed ground expertise as their frontline guardian against the ever-evolving landscape of cyber threats to fight the adversary — and win. Learn more at https://intel471.com/.


The Coolest Business Analytics Companies Of The 2023 Big Data 100

Part 1 of CRN’s Big Data 100 takes a look at the vendors solution providers should know in the data analytics and business intelligence space.

Analytical Approach

It’s no surprise that in diagrams and visual representations of big data IT systems, data analytics and visualization software are usually at the top of the big data “technology stack.” They are the software tools that business analysts and information workers use to gain understanding and insight from the exponentially growing volumes of data businesses are generating today and to share that knowledge throughout an organization.


Worldwide sales of business intelligence software are expected to reach $25.73 billion this year, up 8.6 percent from $23.70 billion in 2022, and reach $34.16 billion by 2028, according to market research firm Statista.

“Companies’ needs for data insights, customer analyses, and all kinds of business processes have strongly increased due to digitization and data that is collected online. This development drives the demand for enterprise software, especially business intelligence software,” said a Statista report.

As part of the CRN 2023 Big Data 100, we’ve put together the following list of business analytics software companies—from well-established vendors to those in startup mode—that solution providers should be familiar with.

These vendors offer everything from self-service reporting and data visualization tools for nontechnical managers and business users to high-performance business analysis software needed by data analysts to tackle the most complex business intelligence tasks.


This week CRN is running the Big Data 100 list in a series of slide shows, organized by technology category, spotlighting vendors of business analytics software, database systems, data warehouse and data lake systems, data management and integration software, data observability tools, and big data systems and cloud platforms.

Some vendors market big data products that span multiple technology categories. They appear in the slideshow for the technology segment in which they are most prominent.

Ahana

Co-Founder and CEO: Steven Mih

Ahana develops Ahana Cloud for Presto, a software-as-a-service data analytics service based on Presto, the open-source SQL query engine used to query data in a range of data sources including database and data lakehouse systems.

Venture-backed Ahana, founded in 2020 and based in San Mateo, Calif., was acquired by IBM earlier this month. With the acquisition IBM joined the Presto Foundation, part of the Linux Foundation.

Alteryx

CEO: Mark Anderson

Alteryx offers its flagship Alteryx Analytics Automation Platform, a unified data analytics and data science automation system, and the Alteryx Analytics Cloud Platform, along with a number of additional business intelligence, machine learning and developer tools.

In February the company added new self-service and enterprise-grade capabilities to the enterprise edition.

Alteryx executives say the Irvine, Calif.-based company has recorded significant momentum in the channel after undertaking a major expansion of its partner program in 2022 as part of a shift to a more partner-centric sales strategy.

AtScale

CEO: Chris Lynch

AtScale develops semantic layer technology for business intelligence that the company says insulates data consumers from the complexity of working with raw data.

The AtScale software creates a business-oriented data model and virtualizes queries for cloud data platforms such as Amazon Redshift, Snowflake, Google BigQuery and Databricks without the need for ETL (extract, transform, load) tools or data movement.
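At its core, a semantic layer maps business terms to the underlying columns and aggregations, so analysts ask for “revenue by region” rather than writing raw SQL. The sketch below is purely illustrative — the metric names, column mappings and generated SQL are assumptions, not AtScale’s actual model format.

```python
# Toy semantic layer: business-friendly names map to SQL fragments,
# and a query is generated on demand (no ETL, no data movement).
METRICS = {"revenue": "SUM(order_total)", "orders": "COUNT(*)"}
DIMENSIONS = {"region": "customer_region", "month": "order_month"}

def build_query(metric: str, dimension: str, table: str = "sales") -> str:
    """Translate a (metric, dimension) request into SQL text."""
    m, d = METRICS[metric], DIMENSIONS[dimension]
    return f"SELECT {d} AS {dimension}, {m} AS {metric} FROM {table} GROUP BY {d}"

print(build_query("revenue", "region"))
# SELECT customer_region AS region, SUM(order_total) AS revenue FROM sales GROUP BY customer_region
```

A real semantic layer adds joins, security rules and caching on top, but the translation step — business vocabulary in, platform-specific SQL out — is the essential mechanism.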

This year AtScale, headquartered in Boston, has expanded its connections to the Databricks platform, including the Databricks Lakehouse for Manufacturing, and achieved the Snowflake Ready Technology Validation for business intelligence solutions. In April the company introduced code-first data modeling capabilities for its semantic layer platform.

CelerData

CEO: James Li

Startup CelerData markets a high-performance unified analytics platform based on the StarRocks massively parallel processing SQL database for real-time analytics. CelerData’s founders developed StarRocks in 2020 and earlier this year contributed it to the Linux Foundation.

In March CelerData, headquartered in Menlo Park, Calif., took aim at the fast-growing data lakehouse space with a new release of its software with a cloud-native architecture, real-time streaming analytics, and support for open data table formats Hudi, Iceberg and Delta Lake.

Domo

CEO: Josh James

Domo develops its namesake “data experience” platform, a cloud-based, mobile data analytics and visualization system that provides decision makers with real-time business data from a wide range of operational applications and data sources.

In March the American Fork, Utah-based company debuted Domo Cloud Amplifier, which the company says helps users unlock data across multiple cloud platforms using a single virtual layer.

On March 6, Domo said that John Mellor, who served as Domo’s chief strategy officer for three years and then CEO for more than a year, was stepping down and that Josh James, the company’s founder and original CEO, was again assuming the CEO post.

Hitachi Vantara

CEO: Gajen Kandiah

Hitachi Vantara, a wholly-owned subsidiary of Hitachi Ltd., offers a portfolio of data analytics, data storage and data operations products, the latter including data integration, catalog and optimization tools and content intelligence software. The product suite includes the company’s Pentaho data integration and analytics platform.

Incorta

CEO: Osama Elkady

Incorta develops the Open Data Delivery Platform, a system for acquiring, processing and analyzing raw, operational data from business applications in real time. The platform, with Incorta’s Direct Data Mapping technology, connects to more than 240 data sources.

In March, Incorta, based in San Mateo, Calif., joined Google Cloud’s Ready initiatives for BigQuery and AlloyDB, validating its technology for those Google Cloud systems and making it easier for businesses and organizations to move business application and ERP (enterprise resource planning) data into the cloud.

Kyligence

CEO: Luke Han

Kyligence Enterprise, the company’s flagship multi-dimensional analytics platform, is designed to analyze massive datasets with its data modeling, query and processing capabilities. The platform is based on the open-source Apache Kylin distributed OLAP engine that was developed by Kyligence’s founders.
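Multidimensional (OLAP) engines like Kylin gain their speed by precomputing aggregates across combinations of dimensions. The toy sketch below shows the idea on a three-row fact table — a real engine does this distributed and at massive scale; the data and function names here are invented for illustration.

```python
# Toy OLAP "cube": aggregate a measure over every subset of dimensions,
# so any roll-up (grand total, by region, by region+product) is a lookup.
from collections import defaultdict
from itertools import combinations

rows = [
    {"region": "EU", "product": "A", "sales": 10},
    {"region": "EU", "product": "B", "sales": 5},
    {"region": "US", "product": "A", "sales": 7},
]

def cube(rows, dims, measure):
    """Precompute the measure for every combination of dimension values."""
    out = defaultdict(int)
    for r in rows:
        for k in range(len(dims) + 1):
            for subset in combinations(dims, k):
                key = tuple((d, r[d]) for d in subset)
                out[key] += r[measure]
    return dict(out)

c = cube(rows, ["region", "product"], "sales")
print(c[()])                                    # 22 (grand total)
print(c[(("region", "EU"),)])                   # 15
print(c[(("region", "US"), ("product", "A"))])  # 7
```

Once the cube is built, answering a query means reading one precomputed entry instead of rescanning the fact table — the trade-off being storage and build time, which is what distributed engines are built to manage.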

Earlier this month San Jose, Calif.-based Kyligence announced the general availability of Kyligence Zen, an intelligent metrics platform for developing and centralizing all types of data metrics into a unified catalog system.

Kyvos Insights

CEO: Praveen Kankariya

Kyvos Insights offers a data analytics acceleration platform, based on the company’s Smart OLAP technology, that the company says improves the performance of business intelligence tools like Tableau, Google Cloud Looker and Microsoft Power BI.

MicroStrategy

President and CEO: Phong Le

MicroStrategy is one of the long-time players in the data analytics arena and bills itself as the largest independent, publicly traded business intelligence company. Headquartered in Tysons Corner, Va., MicroStrategy offers its enterprise analytics and embedded analytics platforms as part of the company’s “intelligence everywhere” vision.

In February the company reported that revenue in 2022 was $499.3 million, down more than 2 percent from $510.8 million in 2021.

In addition to its analytics software business, MicroStrategy also acquires and holds Bitcoin and as of the end of 2022 owned 132,500 Bitcoin.

Promethium

CEO: Kaycee Lai

Startup Promethium says that its collaborative data virtualization platform accelerates data analytics projects by eliminating data management and analytics complexity.

Promethium touts its software as a key element for data fabric initiatives that connect an organization’s data for analytics, machine learning and other tasks without the need for traditional ETL tools and approaches.

Pyramid Analytics

CEO: Omri Kohl

Pyramid Analytics describes the company’s Pyramid Decision Intelligence Platform as a fully integrated, no-code, AI-driven system that combines data preparation, data science, and self-service and augmented business analytics capabilities.

The company has been accelerating the expansion of its channel alliances over the last year including establishing partnerships with technology companies, ISVs, VARs, consulting firms and systems integrators. In November the company hired former IBM and Oracle executive Bill Clayton as vice president of global partner sales.

Earlier this month Amsterdam-based Pyramid Analytics said it is expanding into Mexico and Latin America via a series of partner agreements with top regional technology consultants including Analytics Mate, AS Analytics, BACIT, EMC Software C.A. and Hopewell Systems C.A.

Qlik

CEO: Mike Capone

Qlik is one of the leading business analytics providers with its flagship Qlik Sense data analytics, discovery and visualization software.

The company has expanded beyond its core analytics offerings in recent years with a number of savvy acquisitions including data integration tech developer Attunity in 2019 and automated machine learning software developer Big Squid in 2021. Qlik is now in the process of acquiring Talend, a leading provider of data management and integration tools.

In March Qlik launched its Connector Factory initiative to broaden the range of SaaS applications, legacy software and data sources that can link to its data analytics and integration products.

Rockset

CEO: Venkat Venkataramani

Rockset offers a cloud-based search and analytics database service for developing real-time analytical applications that require low-latency, high-concurrency analytical queries and data aggregations. The service is built on the RocksDB real-time indexing database.
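Rockset’s indexing technology is proprietary, but the reason indexing enables low-latency, high-concurrency queries can be shown with a toy inverted index: filters become intersections of small posting lists rather than full scans. Everything below (documents, field names, the query helper) is an invented illustration.

```python
# Toy inverted index: map (field, value) pairs to the IDs of matching
# documents, so a filter query intersects sets instead of scanning.
from collections import defaultdict

docs = {
    1: {"city": "Berlin", "status": "active"},
    2: {"city": "Berlin", "status": "closed"},
    3: {"city": "Paris",  "status": "active"},
}

index = defaultdict(set)
for doc_id, doc in docs.items():
    for field, value in doc.items():
        index[(field, value)].add(doc_id)

def query(**filters):
    """Intersect posting lists; never touches non-matching documents."""
    sets = [index[item] for item in filters.items()]
    return set.intersection(*sets) if sets else set()

print(query(city="Berlin", status="active"))  # {1}
```

Indexing every field up front trades write-time work and storage for query-time speed, which is the core bargain behind real-time analytical databases of this kind.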

In January Rockset said the San Mateo, Calif.-based company recorded exponential growth in 2022, including annual recurring revenue that grew 3.6x during the year and a customer base that more than doubled in size.

Salesforce/Tableau

CEO: Marc Benioff

Tableau Software was one of the industry’s most popular data analytics and visualization tools when cloud application giant Salesforce acquired the company in August 2019 for $15.7 billion.

Since then, Salesforce has been integrating Tableau with its other products, including its Einstein Discovery for predictive analytics and its Slack messaging application.

Earlier this year Tableau was updated with new data mapping capabilities and a new “Data Stories” capability for Tableau Server that uses natural language to automatically create a fully customizable story in narrative form to help business users understand analytical results.

SAS

CEO: Jim Goodnight

SAS is one of the largest and most established companies in the big data space and offers one of the industry’s broadest business analytics product portfolios led by its flagship SAS Viya data analytics, AI and data management platform.

In addition to Viya, SAS develops a number of business analytics applications for specific tasks, such as marketing, fraud detection and risk management, and for specific industries including banking, insurance, life sciences and retail.

In June 2022, SAS expanded its expertise and technology offerings in financial risk management with its acquisition of Hawaii-based Kamakura and its portfolio of applications for the financial services industry.

Privately held SAS, headquartered in Cary, N.C., is looking to go public over the next year or so and is taking financial and operational steps in preparation for an initial public offering.

Sisense

CEO: Amir Orad

The Sisense business analytics portfolio includes the Sisense Fusion Analytics platform, Sisense Fusion Embed for building analytical capabilities into applications and workflows, and Sisense Infusion Applications that provide data to users of everyday business applications including Microsoft Office 365 and Teams, Google Sheets and Slack.

Starburst

CEO: Justin Borgman

Starburst develops a data analytics platform, Starburst Enterprise, that can analyze huge volumes of data distributed across multiple locations – an alternative to the traditional approach of collecting and consolidating data in a central data warehouse. The company also offers Starburst Galaxy, a fully managed data lake analytics platform for handling petabyte-scale datasets.

The company’s software is built on the open-source Trino distributed SQL query engine. In June 2022 Starburst acquired Varada, an Israeli developer of data lakes acceleration technology. In April the company announced new integration with dbt Cloud, allowing data teams to more easily build data pipelines spanning multiple data sources on one central plane.

In early 2022 Starburst raised $250 million in a Series D funding round that at the time put the startup’s valuation at $3.35 billion.

ThoughtSpot

CEO: Sudheesh Nair

ThoughtSpot is one of the leading vendors in the data analytics space with its search-driven ThoughtSpot Analytics and ThoughtSpot Everywhere software. Earlier this year the company debuted ThoughtSpot Sage, which provides natural language analytics through GPT-3 integration.

Through 2020 and 2021 the company made a major pivot to cloud computing with its Modern Analytics Cloud offering.

In March ThoughtSpot, headquartered in Mountain View, Calif., announced an expanded strategic partnership with Google Cloud under which ThoughtSpot runs natively on the Google Cloud Platform and is integrated with a number of Google Cloud services.

Tibco/Cloud Software Group

CEO: Tom Krause

Tibco offers a portfolio of data management, integration and analytics products assembled through a series of acquisitions and organized as “Connect,” “Unify” and “Predict” products. Tibco Spotfire, for example, is the company’s flagship data analysis software while Tibco Statistica has been renamed Tibco Data Science.

In 2022 Tibco, owned by Vista Equity Partners, was merged with cloud and virtualization tech vendor Citrix Systems (acquired by Vista and Elliott Management affiliate Evergreen Coast Capital) in a $16.5 billion deal to create the Cloud Software Group.

Pentaho furthers innovation in Big Data integration and launches Pentaho Labs

Delivering the future of analytics, Pentaho Corporation today introduced a new adaptive big data layer in its platform that accelerates access to, and integration with, the latest versions and capabilities of popular big data stores. It also announced a “think tank” called Pentaho Labs for innovating breakthrough big data-driven technologies in areas such as predictive and real-time analytics.

The Pentaho adaptive big data layer supports Hadoop distributions from Cloudera, Hortonworks, MapR and Intel, as well as popular NoSQL databases Cassandra and MongoDB, and introduces support for Splunk. With Pentaho, data can be accessed once then processed, combined and consumed anywhere. These new Pentaho big data innovations bring greater flexibility, insulation from change and increased competitive advantage to companies facing the relentless evolution of the big data ecosystem.

According to Richard Daley, founder and chief strategy officer at Pentaho, “The relatively breakneck speed at which big data analytics technology evolves as compared to the relational world is paralyzing many companies. The innovations we’re announcing today overcome this paralysis and allow companies to keep their big data technology options open, reduce risk and save considerable development time while taking advantage of the latest innovations in popular Big Data stores.”

Pentaho Adaptive Big Data Layer

Pentaho Business Analytics can plug directly into leading-edge big data technologies through an advanced adaptive big data layer that supports the latest versions of Hadoop distributions, NoSQL databases and specialized big data stores. New capabilities include:

• Hadoop distributions: Pentaho’s new adaptive big data layer supports Cloudera CDH 4.1.2, 4.1.3, 4.2.0 and 4.2.1; Intel IDH 2.3; Hortonworks HDP 1.2.x; and MapR 2.0.x and 2.1.x.

• NoSQL databases: Pentaho also delivers support for the latest features in MongoDB and Cassandra.

• Splunk: Machine data is one of the fastest-growing and most pervasive segments of big data. Pentaho’s new Splunk adapter allows reading data from and writing data to Splunk.

New Pentaho Labs

Pentaho Labs, led by Richard Daley, is staffed with top industry experts and a renowned data scientist to incubate breakthrough advanced analytics capabilities driven by big data. Pentaho Labs encourages the seeding of new approaches and technologies that can, over time, be merged into the Pentaho roadmap based on market demand.

According to Krishna Roy, BI and analytics analyst at 451 Research, “We are talking to many companies who are very interested in big data analytics, but lack the knowledge, skills and resources to keep up with the rapid pace of change in the ecosystem or even get started. Pentaho recognizes this and today’s announcements help lower the barriers to entry and accelerate innovation.”

Hitachi Vantara Delves Deeper Into Streaming Data Analytics With Pentaho 8.0 Release

Hitachi Vantara has begun shipping Pentaho 8.0, a new release of the company’s business analytics software with advanced connectivity to streaming data sources for real-time business analytics.

As past practices of analyzing historical data give way to analyzing data in real time or near-real-time, businesses are increasingly demanding that their business intelligence tools be capable of handling streaming data from Internet of Things networks, social applications and cloud systems.

Market researcher IDC forecasts that the volume of generated data will increase by a factor of 10 by 2025. More importantly, 25 percent of that data will be real time – with 95 percent of that data streaming from IoT systems.


“A lot of businesses have come to the conclusion that data is relevant. But they have also discovered that they don’t know how to integrate data, analyze it and use it to gain insights,” said Dennis Wilbrink, a data and analytics consultant at Incentro, a Netherlands-based solution provider and longtime Pentaho partner.

Anticipating the growing demand for ways to derive value from all that data, Hitachi acquired business analytics software developer Pentaho in 2015.

The company recently combined the Pentaho business with its Hitachi Data Systems and Hitachi Insight Group operations within Hitachi Vantara, a wholly owned subsidiary.

“Pentaho 8.0 is really all about improving connectivity to these real-time data streams,” said Arik Pelkey, senior director of Pentaho product marketing, in an interview with CRN.

Pentaho 8.0 offers improved connectivity to streaming data sources, most notably Apache Kafka, the publish/subscribe messaging system that handles large data volumes in a growing number of IT organizations. The 8.0 edition specifically enables real-time processing with specialized steps that connect Pentaho Data Integration to Kafka.
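Pentaho’s Kafka steps are configured in Pentaho Data Integration’s graphical designer rather than in code, but the publish/subscribe model they connect to can be sketched in a few lines of plain Python. This is an illustrative, in-memory stand-in for a real Kafka broker (the class, topic name and sensor readings here are hypothetical, not part of Pentaho or Kafka):

```python
from collections import defaultdict, deque

class MiniBroker:
    """In-memory stand-in for a publish/subscribe broker such as Kafka."""
    def __init__(self):
        self.topics = defaultdict(deque)  # topic name -> queue of messages

    def publish(self, topic, message):
        # A producer (e.g. an IoT sensor feed) appends messages to a topic.
        self.topics[topic].append(message)

    def consume(self, topic, max_messages=10):
        # A consumer pulls a batch of pending messages, as a streaming
        # ingestion step would on each poll.
        out = []
        while self.topics[topic] and len(out) < max_messages:
            out.append(self.topics[topic].popleft())
        return out

broker = MiniBroker()
for reading in [21.5, 21.7, 22.0]:
    broker.publish("sensor-readings", reading)

# The consumer plays the role a PDI Kafka input step plays in a pipeline.
batch = broker.consume("sensor-readings")
print(batch)  # [21.5, 21.7, 22.0]
```

The key property the sketch captures is the decoupling: producers and consumers share only a topic name, which is what lets a data-integration tool subscribe to an existing stream without changing the systems that feed it.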

The new release also fully enables stream data ingestion and processing using either the software’s native engine or the Spark in-memory processing engine.

Those new capabilities will help meet the demands that Incentro is seeing from customers for the ability to handle streaming data from IoT systems, automotive systems and financial service systems, said Wilbrink in an interview with CRN.

“The notion of streaming data is becoming increasingly important to critical business operations,” Incentro’s Wilbrink said, pointing to use cases in financial services and telecommunications.

While Wilbrink said Pentaho 7.1 offered some real-time analytical capabilities, “with the release of 8.0, they have done a lot of development on the streaming data side.”

The new Pentaho release builds on its enterprise-level security for Cloudera and Hortonworks platforms by supporting the Knox Gateway for authenticating users for Hadoop services.

The 8.0 edition also provides a number of new features and functions to help IT optimize data processing resources.

Adaptive execution, which matches workloads to the most appropriate processing engine without rewriting data integration logic, was introduced in Pentaho 7.1. The 8.0 release makes it easier to set up, use and secure adaptive execution. It also makes adaptive execution available for the Hortonworks platform.
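Adaptive execution is set up inside Pentaho’s own tools, but the underlying idea, a single job definition dispatched to whichever engine best fits the workload, resembles a simple dispatch pattern. A hypothetical sketch (the engine names, step names and row-count threshold are illustrative, not Pentaho’s actual mechanism):

```python
def choose_engine(estimated_rows, threshold=1_000_000):
    """Pick a processing engine for a job without changing the job itself."""
    # Small jobs stay on the lightweight native engine; large ones go to Spark.
    return "spark" if estimated_rows >= threshold else "native"

def run_job(job_steps, estimated_rows):
    engine = choose_engine(estimated_rows)
    # The same step list is handed to whichever engine was selected, so the
    # data integration logic never has to be rewritten per engine.
    return {"engine": engine, "steps": job_steps}

small = run_job(["read_csv", "filter", "write_output"], estimated_rows=50_000)
large = run_job(["read_csv", "filter", "write_output"], estimated_rows=20_000_000)
print(small["engine"], large["engine"])  # native spark
```

The design point is that the job definition and the engine choice are independent, which is what allows the engine to be swapped, or a new one added, without touching the pipelines themselves.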

The 8.0 release allows IT managers to utilize additional compute nodes and spread workloads across all available computation resources to match demand. It also supports the popular Avro and Parquet big data file formats.

Wilbrink said the enhancements move data processing closer to where the data resides, significantly speeding up processing times.

The Incentro data and analytics consultant said he has been evaluating the community edition of Pentaho 8.0, which came out earlier, and has been waiting for the general availability of the commercial release.

Another focus of the 8.0 release is speeding up the development and implementation times for business analysis projects.

“There’s a shortage of big data developers. So you want to get the most out of those resources,” Pelkey said, noting that between 60 percent and 80 percent of business analysis projects’ time is spent on data preparation.

