May 19, 2020

Microsoft launches Azure Synapse Link to help enterprises get faster insights from their data

At its Build developer conference, Microsoft today announced Azure Synapse Link, a new enterprise service that allows businesses to analyze their data faster and more efficiently, using an approach that’s generally called “hybrid transaction/analytical processing” (HTAP). That’s a mouthful, but it essentially means enterprises can run analytical and transactional workloads on a single system. Traditionally, enterprises had to choose between building a single system for both, which was often highly over-provisioned, or maintaining separate systems for transactional and analytics workloads.

Last year, at its Ignite conference, Microsoft announced Azure Synapse Analytics, an analytics service that combines analytics and data warehousing to create what the company calls “the next evolution of Azure SQL Data Warehouse.” Synapse Analytics brings together data from Microsoft’s services and those from its partners and makes it easier to analyze.

“One of the key things, as we work with our customers on their digital transformation journey, there is an aspect of being data-driven, of being insights-driven as a culture, and a key part of that really is that once you decide there is some amount of information or insights that you need, how quickly are you able to get to that? For us, time to insight and a secondary element, which is the cost it takes, the effort it takes to build these pipelines and maintain them with an end-to-end analytics solution, was a key metric we have been observing for multiple years from our largest enterprise customers,” said Rohan Kumar, Microsoft’s corporate VP for Azure Data.

Synapse Link takes the work Microsoft did on Synapse Analytics a step further by removing the barriers between Azure’s operational databases and Synapse Analytics, so enterprises can immediately get value from the data in those databases without going through a data warehouse first.

“What we are announcing with Synapse Link is the next major step in the same vision that we had around reducing the time to insight,” explained Kumar. “And in this particular case, a long-standing barrier that exists today between operational databases and analytics systems is these complex ETL (extract, transform, load) pipelines that need to be set up just so you can do basic operational reporting or where, in a very transactionally consistent way, you need to move data from your operational system to the analytics system, because you don’t want to impact the performance of the operational system in any way because that’s typically dealing with, depending on the system, millions of transactions per second.”

ETL pipelines, Kumar argued, are typically expensive and hard to build and maintain, yet enterprises are now building new apps — and maybe even line-of-business mobile apps — where any action a consumer takes, once registered in the operational database, should immediately be available for predictive analytics, for example.

From the user’s perspective, enabling this takes only a single click to link the two, and it removes the need to manage additional data pipelines or database resources. That, Kumar said, was always the main goal for Synapse Link. “With a single click, you should be able to enable real-time analytics on your operational data in ways that don’t have any impact on your operational systems, so you’re not using the compute part of your operational system to do the query, you actually have to transform the data into a columnar format, which is more adaptable for analytics, and that’s really what we achieved with Synapse Link.”

Because traditional HTAP systems on-premises typically share their compute resources with the operational database, those systems never quite took off, Kumar argued. In the cloud, with Synapse Link, though, that impact doesn’t exist because you’re dealing with two separate systems. Now, once a transaction gets committed to the operational database, the Synapse Link system transforms the data into a columnar format that is more optimized for the analytics system — and it does so in real time.
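To make this concrete, here is a minimal PySpark sketch of what a query against the analytical store looks like from an Azure Synapse Spark pool once a Cosmos DB container is linked; the linked service name (“CosmosDbLink”) and container name (“orders”) are hypothetical.

    # Runs in an Azure Synapse Spark pool, where `spark` is predefined.
    # The "cosmos.olap" format reads the columnar analytical store that
    # Synapse Link maintains, not the transactional store.
    df = (spark.read
          .format("cosmos.olap")
          .option("spark.synapse.linkedService", "CosmosDbLink")  # hypothetical name
          .option("spark.cosmos.container", "orders")             # hypothetical name
          .load())

    # The aggregation runs against the columnar copy, so it consumes no
    # request units on the operational side of Cosmos DB.
    df.groupBy("status").count().show()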

For now, Synapse Link is only available in conjunction with Microsoft’s Cosmos DB database. As Kumar told me, that’s because that’s where the company saw the highest demand for this kind of service, but you can expect the company to add support for Azure SQL, Azure Database for PostgreSQL and Azure Database for MySQL in the future.

Apr 22, 2020

Fishtown Analytics raises $12.9M Series A for its open-source analytics engineering tool

Philadelphia-based Fishtown Analytics, the company behind the popular open-source data engineering tool dbt, today announced that it has raised a $12.9 million Series A round led by Andreessen Horowitz, with the firm’s general partner Martin Casado joining the company’s board.

“I wrote this blog post in early 2016, essentially saying that analysts needed to work in a fundamentally different way,” Fishtown founder and CEO Tristan Handy told me, when I asked him about how the product came to be. “They needed to work in a way that much more closely mirrored the way the software engineers work and software engineers have been figuring this shit out for years and data analysts are still like sending each other Microsoft Excel docs over email.”

The dbt open-source project forms the basis of this. It allows anyone who can write SQL queries to transform data and then load it into their preferred analytics tools. As such, it sits between data warehouses and the tools that load data into them on one end, and specialized analytics tools on the other.

As Casado noted when I talked to him about the investment, data warehouses have now made it affordable for businesses to store all of their data before it is transformed. So what was traditionally “extract, transform, load” (ETL) has now become “extract, load, transform” (ELT). Andreessen Horowitz is already invested in Fivetran, which helps businesses move their data into their warehouses, so it makes sense for the firm to also tackle the other side of this business.
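To illustrate the pattern dbt builds on (the “T” is just SQL executed inside the warehouse), here is a minimal, self-contained Python sketch with sqlite3 standing in for the warehouse. It shows the idea dbt automates, not dbt’s own API; the table and column names are invented.

    import sqlite3

    conn = sqlite3.connect(":memory:")

    # "Extract" and "load": raw records land in the warehouse untransformed.
    conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER, status TEXT)")
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, 1250, "complete"), (2, 480, "returned"), (3, 3100, "complete")],
    )

    # "Transform": a dbt-style model is essentially a SELECT statement that
    # the tool materializes as a view or table inside the warehouse.
    conn.execute("""
        CREATE VIEW orders AS
        SELECT id, amount_cents / 100.0 AS amount_usd
        FROM raw_orders
        WHERE status = 'complete'
    """)

    print(conn.execute("SELECT * FROM orders").fetchall())  # [(1, 12.5), (3, 31.0)]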

“Dbt is, as far as we can tell, the leading community for transformation and it’s a company we’ve been tracking for at least a year,” Casado said. He also argued that data analysts — unlike data scientists — are not really catered to as a group.

Before this round, Fishtown hadn’t raised a lot of money, apart from a small SAFE round from Amplify, even though it has been around for a few years now.

But Handy argued that the company needed this time to prove that it was on to something and build a community. That community now consists of more than 1,700 companies that use the dbt project in some form and over 5,000 people in the dbt Slack community. Fishtown also now has over 250 dbt Cloud customers and the company signed up a number of big enterprise clients earlier this year. With that, the company needed to raise money to expand and also better service its current list of customers.

“We live in Philadelphia. The cost of living is low here and none of us really care to make a quadro-billion dollars, but we do want to answer the question of how do we best serve the community,” Handy said. “And for the first time, in the early part of the year, we were like, holy shit, we can’t keep up with all of the stuff that people need from us.”

The company plans to expand the team from 25 to 50 employees in 2020, and with those new hires, the team plans to improve and expand the product, especially its IDE for data analysts, which Handy admitted could use a bit more polish.

Feb 24, 2020

Databricks makes bringing data into its ‘lakehouse’ easier

Databricks today announced the launch of its new Data Ingestion Network of partners and the launch of its Databricks Ingest service. The idea here is to make it easier for businesses to combine the best of data warehouses and data lakes into a single platform — a concept Databricks likes to call “lakehouse.”

At the core of the company’s lakehouse is Delta Lake, Databricks’ Linux Foundation-managed open-source project that brings a new storage layer to data lakes, one that helps users manage the lifecycle of their data and ensures data quality through schema enforcement, transaction logs and more. Databricks users can now work with the first five partners in the Ingestion Network — Fivetran, Qlik, Infoworks, StreamSets and Syncsort — to automatically load their data into Delta Lake. To ingest data from these partners, Databricks customers don’t have to set up any triggers or schedules — instead, data automatically flows into Delta Lake.
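As a rough illustration of what that storage layer looks like to a Spark user, here is a short PySpark sketch that writes to and reads from a Delta Lake table; it assumes the open-source delta-spark package is on the classpath, and the path and columns are invented.

    from pyspark.sql import SparkSession

    # Local Spark session with Delta Lake enabled.
    spark = (SparkSession.builder
             .config("spark.sql.extensions",
                     "io.delta.sql.DeltaSparkSessionExtension")
             .config("spark.sql.catalog.spark_catalog",
                     "org.apache.spark.sql.delta.catalog.DeltaCatalog")
             .getOrCreate())

    df = spark.createDataFrame([(1, "click"), (2, "view")],
                               ["event_id", "event_type"])

    # Every write goes through Delta's transaction log, and appends with a
    # mismatched schema are rejected instead of silently corrupting the lake.
    df.write.format("delta").mode("append").save("/tmp/delta/events")

    spark.read.format("delta").load("/tmp/delta/events").show()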

“Until now, companies have been forced to split up their data into traditional structured data and big data, and use them separately for BI and ML use cases. This results in siloed data in data lakes and data warehouses, slow processing and partial results that are too delayed or too incomplete to be effectively utilized,” says Ali Ghodsi, co-founder and CEO of Databricks. “This is one of the many drivers behind the shift to a Lakehouse paradigm, which aspires to combine the reliability of data warehouses with the scale of data lakes to support every kind of use case. In order for this architecture to work well, it needs to be easy for every type of data to be pulled in. Databricks Ingest is an important step in making that possible.”

Databricks VP of Product Marketing Bharath Gowda also tells me that this will make it easier for businesses to perform analytics on their most recent data and hence be more responsive when new information comes in. He also noted that users will be able to better leverage their structured and unstructured data for building better machine learning models, as well as to perform more traditional analytics on all of their data instead of just a small slice that’s available in their data warehouse.

Feb 19, 2020

Google Cloud opens its Seoul region

Google Cloud today announced that its new Seoul region, its first in Korea, is now open for business. The region, which it first talked about last April, will feature three availability zones and support for virtually all of Google Cloud’s standard services, ranging from Compute Engine to BigQuery, Bigtable and Cloud Spanner.

With this, Google Cloud now has a presence in 16 countries and offers 21 regions with a total of 64 zones. The Seoul region (with the memorable name of asia-northeast3) will complement Google’s other regions in the area, including two in Japan, as well as regions in Hong Kong and Taiwan, but the obvious focus here is on serving Korean companies with low-latency access to its cloud services.

“As South Korea’s largest gaming company, we’re partnering with Google Cloud for game development, infrastructure management, and to infuse our operations with business intelligence,” said Chang-Whan Sul, the CTO of Netmarble. “Google Cloud’s region in Seoul reinforces its commitment to the region and we welcome the opportunities this initiative offers our business.”

Over the course of this year, Google Cloud also plans to open more zones and regions in Salt Lake City, Las Vegas and Jakarta, Indonesia.

Jan 9, 2020

Sisense nabs $100M at a $1B+ valuation for accessible big data business analytics

Sisense, an enterprise startup that has built a business analytics business out of the premise of making big data as accessible as possible to users — whether it be through graphics on mobile or desktop apps, or spoken through Alexa — is announcing a big round of funding today and a large jump in valuation to underscore its traction. The company has picked up $100 million in a growth round of funding that catapults Sisense’s valuation to over $1 billion, funding that it plans to use to continue building out its tech, as well as for sales, marketing and development efforts.

For context, this is a huge jump: The company was valued at only around $325 million in 2016 when it raised a Series E, according to PitchBook. (It did not disclose valuation in 2018, when it raised a venture round of $80 million.) It now has some 2,000 customers, including Tinder, Philips, Nasdaq and the Salvation Army.

This latest round is being led by the high-profile enterprise investor Insight Venture Partners, with Access Industries, Bessemer Venture Partners, Battery Ventures, DFJ Growth and others also participating. The Access investment was made via Claltech in Israel, and it seems this led to some of the details leaking out as rumors in recent days. Insight is in the news today for another big deal: Wearing its private equity hat, the firm acquired Veeam for $5 billion. (And that speaks to a particular kind of trajectory for enterprise companies that the firm backs: Veeam had already been a part of Insight’s venture portfolio.)

Mature enterprise startups that have proven their business cases are going to be an ongoing theme in this year’s fundraising stories, and Sisense is part of that theme, with annual recurring revenues of over $100 million speaking to its stability and current strength. The company has also made some key acquisitions to boost its business, such as the acquisition of Periscope Data last year (coincidentally, also for $100 million, I understand).

Its rise also speaks to a different kind of trend in the market: In the wider world of business intelligence, there is increasing demand for more digestible data that lets organizations better tap advances in data analytics across their teams. This was also one of the big reasons why Salesforce gobbled up Tableau last year for a slightly higher price: $15.7 billion.

Sisense, which combines sleek end-user products with a strong focus on harnessing the latest developments in areas like machine learning and AI to crunch the data and order it in the first place, represents a smaller and more fleet-of-foot alternative for its customers. “We found a way to make accessing data extremely simple, mashing it together in a logical way and embedding it in every logical place,” explained CEO Amir Orad to us in 2018.

“We have enjoyed watching the Sisense momentum in the past 12 months, the traction from its customers as well as from industry leading analysts for the company’s cloud native platform and new AI capabilities. That coupled with seeing more traction and success with leading companies in our portfolio and outside, led us to want to continue and grow our relationship with the company and lead this funding round,” said Jeff Horing, managing director at Insight Venture Partners, in a statement.

To note, Access Industries is an interesting backer that might also shape up to be strategic, given its ownership of Warner Music Group and its investments in Alibaba, Facebook, Square, Spotify, Deezer, Snap and Zalando.

“Given our investments in market leading companies across diverse industries, we realize the value in analytics and machine learning and we could not be more excited about Sisense’s trajectory and traction in the market,” added Claltech’s Daniel Shinar in a statement.

Jul 2, 2019

Software development analytics platform Sourced launches an enterprise edition

Sourced, or source{d}, as the company styles its name, provides developers and IT departments with deeper analytics into their software development life cycle. It analyzes codebases, offers data about which APIs are being used and provides general information about developer productivity and other metrics. Today, Sourced is officially launching its Enterprise Edition, which gives IT departments and executives a number of advanced tools for managing their software portfolios and the processes they use to create them.

“Sourced enables large engineering organizations to better monitor, measure and manage their IT initiatives by providing a platform that empowers IT leaders with actionable data,” said the company’s CEO Eiso Kant. “The release of Sourced Enterprise is a major milestone towards proper engineering observability of the entire software development life cycle in enterprises.”

[Screenshot: Engineering Effectiveness]

Because these are hallmarks of every good enterprise tool, it’s no surprise that Sourced Enterprise also offers features like role-based access control and other security features, as well as dedicated support and SLAs. IT departments can also run the service on-premises or use it as a SaaS product.

The company also tells me that the enterprise version can handle larger codebases, so that even complex queries over a large data set take only a few seconds (or minutes for a really large codebase). The enterprise edition also includes a number of add-ons for creating these advanced queries. “These are available upon request and tailored to help enterprises overcome specific challenges that often rely on machine learning capabilities, such as identity matching or code duplication analysis,” the company says.
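The article doesn’t show Sourced’s query interface, but as a rough stand-in for this kind of codebase analytics, here is a short Python sketch using the open-source PyDriller library to compute a simple developer-activity metric over a local git checkout; the repository path is a placeholder.

    from collections import Counter

    from pydriller import Repository

    # Walk the commit history and count commits per author: a simple
    # developer-activity metric of the sort an engineering-analytics
    # dashboard might chart.
    commits_per_author = Counter()
    for commit in Repository("path/to/repo").traverse_commits():
        commits_per_author[commit.author.name] += 1

    for author, n in commits_per_author.most_common(5):
        print(author, n)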

[Screenshot: Cloud Migration]

The service integrates with most commonly used project management and business intelligence tools, but it also ships with Apache Superset, an open-source business intelligence application that offers built-in data visualization capabilities.

These visualization capabilities are also now part of the Sourced Community Edition, which is now available in private beta.

“Sourced Enterprise gave us valuable insights into the Cloud Foundry codebase evolution, development patterns, trends and dependencies, all presented in easy-to-digest dashboards,” said Chip Childers, the CTO of the open-source Cloud Foundry Foundation, which tested the Enterprise Edition ahead of its launch. “If you really want to understand what’s going on in your codebase and engineering department, Sourced is the way to go.”

To date, the company has raised $10 million from First VC, Heartcore Capital, Xavier Niel and others.

[Screenshot: Talent Assessment Management]

Jul 1, 2019

Video platform Kaltura adds advanced analytics

You may not be familiar with Kaltura's name, but chances are you’ve used the company’s video platform at some point or another, given that it offers a variety of video services for enterprises, educational institutions and video-on-demand platforms, including HBO, Phillips, SAP, Stanford and others. Today, the company announced the launch of an advanced analytics platform for its enterprise and educational users.

This new platform, dubbed Kaltura Analytics for Admins, will provide its users with features like user-level reports. This may sound like a minor feature, because you probably don’t care about the exact details of a given user’s interactions with your video, but it will allow businesses to link this kind of behavior to other metrics. With this, you could measure the ROI of a given video by linking video watch time and sales, for example. This kind of granularity wasn’t possible with the company’s existing analytics systems. Companies and schools using the product will also get access to time-period comparisons to help admins identify trends, deeper technology and geolocation reports, as well as real-time analytics for live events.
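To make the ROI example concrete, here is a toy pandas sketch that joins user-level watch data with sales data; the column names and figures are invented, and this is not Kaltura’s API.

    import pandas as pd

    watch = pd.DataFrame({"user_id": [1, 2, 3],
                          "video_id": ["v1", "v1", "v2"],
                          "watch_seconds": [300, 120, 600]})
    sales = pd.DataFrame({"user_id": [1, 3], "amount_usd": [49.0, 99.0]})

    # User-level reports make this join possible: revenue attributed to
    # the viewers of each video, next to total watch time.
    joined = watch.merge(sales, on="user_id", how="left").fillna({"amount_usd": 0.0})
    per_video = joined.groupby("video_id").agg(
        total_watch_seconds=("watch_seconds", "sum"),
        attributed_revenue=("amount_usd", "sum"),
    )
    print(per_video)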

[Screenshot: eCDN QoS dashboard]

“Video is a unique data type in that it has deep engagement indicators for measurement, both around video creation — what types of content are being created by whom, as well as around video consumption and engagement with content — what languages were selected for subtitles, what hot-spots were clicked upon in video,” said Michal Tsur, president and general manager of Enterprise and Learning at Kaltura. “Analytics is a very strategic area for our customers. Both for tech companies who are building on our VPaaS, as well as for large organizations and universities that use our video products for learning, communication, collaboration, knowledge management, marketing and sales.”

Tsur also tells me the company is looking at how to best use machine learning to give its customers even deeper insights into how people watch videos — and potentially even offer predictive analytics in the long run.

Mar 6, 2019

Clari platform aims to unify go-to-market operations data

Clari started as a company that wanted to give sales teams more information about their sales process than could be found in the CRM database. Today, the company announced a much broader platform, one that can provide insight across sales, marketing and customer service to give a more unified view of a company’s go-to-market operations, all enhanced by AI.

Company co-founder and CEO Andy Byrne says this involves pulling together a variety of data and giving each department the insight to improve their mission. “We are analyzing large volumes of data found in various revenue systems — sales, marketing, customer success, etc. — and we’re using that data to provide a new platform that’s connecting up all of the different revenue departments,” Byrne told TechCrunch.

For sales, that would mean driving more revenue. For marketing, it would involve more targeted plans to drive more sales. And for customer success, it would be about increasing customer retention and reducing churn.

[Screenshot: Clari]

The company’s original idea when it launched in 2012 was to look at a range of data that touched the sales process, such as email, calendars and the CRM database, to bring together a broader view of sales than you could get by looking at the basic customer data stored in the CRM alone. The Clari data could tell the reps things like which deals would be most likely to close and which ones were at risk.

“We were taking all of these signals that had been historically disconnected from each other and we were connecting it all into a new interface for sales teams that’s very different than a CRM,” Byrne said.

Over time, that involved using AI and machine learning to make connections in the data that humans might not have been seeing. The company also found that customers were using the product to look at processes adjacent to sales, and they decided to formalize that and build connectors to relevant parts of the go-to-market system like marketing automation tools from Marketo or Eloqua and customer tools such as Dialpad, Gong.io and Salesloft.

With Clari’s approach, companies can get a unified view without manually pulling all this data together. The goal is to provide customers with a broad view of the go-to-market operation that isn’t possible looking at siloed systems.

The company has experienced tremendous growth over the last year, leaping from 80 customers to 250. These include Okta and Alteryx, two companies that went public in recent years. Clari is based in the Bay Area and has around 120 employees. It has raised more than $60 million. The most recent round was a $35 million Series C last May led by Tenaya Capital.

Feb 14, 2019

Zoho’s office suite gets smarter

As far as big tech companies go, Zoho is a bit different. Not only has it never taken any venture funding, it also offers more than 40 products that range from its online office suite to CRM and HR tools, email, workflow automation services, video conferencing, a bug tracker and everything in-between. You don’t often hear about it, but the company has more than 45 million users worldwide and offices in the U.S., Netherlands, Singapore, Dubai, Yokohama and Beijing — and it owns its data centers, too.

Today, Zoho is launching a major update to its core office suite products: Zoho Writer, Sheet, Show and Notebook. These tools are getting an infusion of AI — under Zoho’s “Zia” brand — as well as new Apple TV and Android integrations and more. All of the tools are getting some kind of AI-based feature or another, but they are also getting support for Zia Voice, Zoho’s conversational AI assistant.

With this, you can now ask questions about data in your spreadsheets, for example, and Zia will create charts and even pivot tables for you. Similarly, Zoho is using Zia in its document editor and presentation tools to provide better grammar and spellchecking tools (and it’ll now offer a readability score and tips for improving your text). In Zoho Notebook, the note-taking application that is also the company’s newest app, Zia can help users create different formats for their note cards based on the content (text, photo, audio, checklist, sketch, etc.).
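As a concrete stand-in for the kind of pivot table Zia generates, here is a small pandas sketch; the data is invented, and this is of course not Zoho’s internal implementation.

    import pandas as pd

    sales = pd.DataFrame({
        "region":  ["East", "East", "West", "West"],
        "quarter": ["Q1", "Q2", "Q1", "Q2"],
        "revenue": [120, 150, 90, 130],
    })

    # "Revenue by region and quarter" is the kind of question you could ask
    # Zia in natural language and get back as a chart or pivot table.
    pivot = sales.pivot_table(index="region", columns="quarter",
                              values="revenue", aggfunc="sum")
    print(pivot)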

“We want to make AI helpful in a very contextual manner for a specific application,” Raju Vegesna, Zoho’s chief evangelist, told me. “Because we do AI across the board, we learned a lot and are able to take learnings from one technology and one piece of context and apply them to another.” Zoho first brought Zia to its business intelligence app, for example, and now it’s essentially bringing the same capabilities to its spreadsheet app, too.

It’s worth noting that Google and Microsoft are doing similar things with their productivity apps, too, of course. Zoho, however, argues that it offers a far wider range of applications — and its stated mission is that you should be able to run your entire business on its platform. And the plan is to bring some form of AI to all of them. “Fast-forward a few months and [our AI grammar and spellchecker] is applied to the business application context — maybe a support agent responding to a customer ticket can use this technology to make sure there are no typos in those responses,” Vegesna said.

There are plenty of other updates in this release, too. Zoho Show now works with Apple TV-enabled devices, for example, and Android users can now use their phones as a smart remote for Show. Zoho Sheet now lets you build custom functions and scripts, and Zoho Writer’s web, mobile and iPad versions can now work completely offline.

The broader context here, though, is that Zoho, with its ridiculously broad product portfolio, is playing a long game. The company has no interest in going public. But it also knows that it’s going up against companies like Google and Microsoft. “Vertical integration is not something that you see in our industry,” said Vegesna. “Companies are in that quick mode of getting traction, sell or go public. We are looking at it in the 10 to 20-year time frame. To really win that game, you need to make these serious investments in the market. The improvements you are seeing here are at the surface level. But we don’t see ourselves as a software company. We see ourselves as a technology company.” And to build up these capabilities, Vegesna said, Zoho has invested hundreds of millions of dollars into its own data centers in the U.S., Europe and Asia, for example.

Feb 7, 2019

Microsoft Azure sets its sights on more analytics workloads

Enterprises now amass huge amounts of data, both from their own tools and applications, as well as from the SaaS applications they use. For a long time, that data was basically exhaust. Maybe it was stored for a while to fulfill some legal requirements, but then it was discarded. Now, data is what drives machine learning models, and the more data you have, the better. It’s maybe no surprise, then, that the big cloud vendors started investing in data warehouses and lakes early on. But that’s just a first step. After that, you also need the analytics tools to make all of this data useful.

Today, it’s Microsoft’s turn to shine the spotlight on its data analytics services. The actual news here is pretty straightforward. Two of these services are moving into general availability: the second generation of Azure Data Lake Storage for big data analytics workloads and Azure Data Explorer, a managed service that makes ad-hoc analysis of massive data volumes easier. Microsoft is also previewing a new feature in Azure Data Factory, its graphical no-code service for building data transformations: Data Factory now features the ability to map data flows.
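For a sense of what ad-hoc analysis against Azure Data Explorer looks like in practice, here is a hedged Python sketch using the azure-kusto-data SDK; the cluster URL, database and table names are placeholders.

    from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

    cluster = "https://mycluster.westus.kusto.windows.net"  # placeholder URL
    kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(cluster)
    client = KustoClient(kcsb)

    # Kusto Query Language (KQL): count events per hour over the last day
    # ("Events" is a placeholder table).
    query = "Events | where Timestamp > ago(1d) | summarize count() by bin(Timestamp, 1h)"
    response = client.execute("mydatabase", query)  # placeholder database

    for row in response.primary_results[0]:
        print(row)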

Those individual news pieces are interesting if you are a user or are considering Azure for your big data workloads, but what’s maybe more important here is that Microsoft is trying to offer a comprehensive set of tools for managing and storing this data — and then using it for building analytics and AI services.

(Photo credit: Josh Edelson/AFP/Getty Images)

“AI is a top priority for every company around the globe,” Julia White, Microsoft’s corporate VP for Azure, told me. “And as we are working with our customers on AI, it becomes clear that their analytics often aren’t good enough for building an AI platform.” These companies are generating plenty of data, which then has to be pulled into analytics systems. She stressed that she couldn’t remember a customer conversation in recent months that didn’t focus on AI. “There is urgency to get to the AI dream,” White said, but the growth and variety of data presents a major challenge for many enterprises. “They thought this was a technology that was separate from their core systems. Now it’s expected for both customer-facing and line-of-business applications.”

Data Lake Storage helps with managing this variety of data since it can handle both structured and unstructured data (and is optimized for the Spark and Hadoop analytics engines). The service can ingest any kind of data — yet Microsoft still promises that it will be very fast. “The world of analytics tended to be defined by having to decide upfront and then building rigid structures around it to get the performance you wanted,” explained White. Data Lake Storage, on the other hand, wants to offer the best of both worlds.
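As a small illustration of that ingestion story, here is a hedged Python sketch that uploads a raw record to Azure Data Lake Storage Gen2 with the azure-storage-file-datalake SDK; the account name, file system and credential are placeholders.

    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url="https://myaccount.dfs.core.windows.net",  # placeholder
        credential="<account-key>",                            # placeholder
    )
    fs = service.get_file_system_client("raw")  # placeholder file system

    # Land unstructured data as-is; structure can be imposed later by the
    # analytics engine (e.g. Spark) reading from the lake.
    file = fs.get_file_client("events/2019-02-07.json")
    file.upload_data(b'{"event": "page_view"}', overwrite=True)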

White also argued that while many enterprises used to run these services on their own on-premises servers, often on appliance-based systems, the cloud has now reached the point where the price/performance calculations are in its favor. It took a while to get to this point, though, and to convince enterprises. White noted that for the longest time, enterprises that looked at their analytics projects thought $300 million projects took forever, tied up lots of people and were frankly a bit scary. “But also, what we had to offer in the cloud hasn’t been amazing until some of the recent work,” she said. “We’ve been on a journey — as well as the other cloud vendors — and the price performance is now compelling.” And it sure helps that if enterprises want to meet their AI goals, they’ll now have to tackle these workloads, too.
