Jul
01
2019
--

Video platform Kaltura adds advanced analytics

You may not be familiar with Kaltura’s name, but chances are you’ve used the company’s video platform at one point or another, given that it offers a variety of video services for enterprises, educational institutions and video-on-demand platforms, including HBO, Phillips, SAP, Stanford and others. Today, the company announced the launch of an advanced analytics platform for its enterprise and educational users.

This new platform, dubbed Kaltura Analytics for Admins, will provide its users with features like user-level reports. This may sound like a minor feature, because you probably don’t care about the exact details of a given user’s interactions with your video, but it will allow businesses to link this kind of behavior to other metrics. With this, you could measure the ROI of a given video by linking video watch time and sales, for example. This kind of granularity wasn’t possible with the company’s existing analytics systems. Companies and schools using the product will also get access to time-period comparisons to help admins identify trends, deeper technology and geolocation reports, as well as real-time analytics for live events.
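
To make the ROI example concrete, here is a minimal Python sketch of the kind of join that user-level reports enable. The column names and figures are hypothetical, not Kaltura’s actual export format:

```python
import pandas as pd

# Hypothetical exports: user-level watch time from the analytics platform
# and deal values from a CRM. All names and numbers are illustrative.
watch = pd.DataFrame({
    "user_id": [1, 2, 3, 4],
    "video_id": ["demo-01", "demo-01", "demo-02", "demo-01"],
    "watch_seconds": [310, 45, 600, 290],
})
sales = pd.DataFrame({
    "user_id": [1, 3, 4],
    "deal_value": [12000, 8000, 5000],
})

# Link per-user viewing behavior to revenue, then roll up per video.
joined = watch.merge(sales, on="user_id", how="left").fillna({"deal_value": 0})
per_video = joined.groupby("video_id").agg(
    total_watch_seconds=("watch_seconds", "sum"),
    attributed_revenue=("deal_value", "sum"),
)
per_video["revenue_per_minute"] = per_video["attributed_revenue"] / (
    per_video["total_watch_seconds"] / 60
)
print(per_video)
```

Without user-level data, only the aggregate watch numbers exist and the join in the middle is impossible, which is why this seemingly minor feature matters.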

eCDN QoS dashboard

“Video is a unique data type in that it has deep engagement indicators for measurement, both around video creation — what types of content are being created by whom, as well as around video consumption and engagement with content — what languages were selected for subtitles, what hot-spots were clicked upon in video,” said Michal Tsur, president and general manager of Enterprise and Learning at Kaltura. “Analytics is a very strategic area for our customers. Both for tech companies who are building on our VPaaS, as well as for large organizations and universities that use our video products for learning, communication, collaboration, knowledge management, marketing and sales.”

Tsur also tells me the company is looking at how to best use machine learning to give its customers even deeper insights into how people watch videos — and potentially even offer predictive analytics in the long run.

Jun
07
2019
--

Google continues to preach multi-cloud approach with Looker acquisition

When Google announced yesterday morning that it was buying Looker for $2.6 billion, you couldn’t blame some of the company’s 1,600 customers if they worried a bit about whether Looker would continue its multi-cloud approach. But Google Cloud chief Thomas Kurian made clear that the company will continue to support an open approach to its latest purchase when it joins the fold later this year.

It’s consistent with the messaging from Google Next, the company’s cloud conference in April, where Google positioned itself as the more open cloud: friendlier to open-source projects, which it runs directly on Google Cloud, and offering a way to manage your workloads wherever they live with Anthos.

Ray Wang, founder and principal analyst at Constellation Research, says that in a multi-cloud world, Looker represented one of the best choices, and that could be why Google went after it. “Looker’s strengths include its centralized data-modeling and governance, which promotes consistency and reuse. It runs on top of modern cloud databases including Google BigQuery, AWS Redshift and Snowflake,” Wang told TechCrunch. He added, “They wanted to acquire a tool that is as easy to use as Microsoft Power BI and as deep as Tableau.”

Patrick Moorhead, founder and principal analyst at Moor Insights & Strategy, also sees this deal as part of a consistent multi-cloud message from Google. “I do think it is in alignment with its latest strategy outlined at Google Next. It has talked about rich analytics tools that could pull data from disparate sources,” he said.

Kurian pushing the multi-cloud message

Google Cloud CEO Thomas Kurian, who took over from Diane Greene at the end of last year, was careful to emphasize the company’s commitment to multi-cloud and multi-database support in comments to media and analysts yesterday. “We first want to reiterate, we’re very committed to maintaining local support for other clouds, as well as to serve data from multiple databases because customers want a single analytics foundation for their organization, and they want to be able to in the analytics foundation, look at data from multiple data sources. So we’re very committed to that,” Kurian said yesterday.

From a broader customer perspective, Kurian sees Looker providing customers with a single way to access and visualize data. “One of the things that is challenging for organizations in operationalizing business intelligence, that we feel that Looker has done really well, is it gives you a single place to model your data, define your data definitions — like what’s revenue, who’s a gold customer or how many service tickets are open — and allows you then to blend data across individual data silos, so that as an organization, you’re working off a consistent set of metrics,” Kurian explained.
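
As a generic illustration of what a single metric definition buys you (a pandas sketch of the idea, not Looker’s LookML modeling language), consider defining “revenue” exactly once per source and reusing that definition everywhere:

```python
import pandas as pd

# Two hypothetical data silos with different schemas.
web_orders = pd.DataFrame({"gross": [100.0, 250.0], "refund": [0.0, 50.0]})
retail_orders = pd.DataFrame({"amount_cents": [30000, 12500]})

# "Revenue" is defined once per source; every report reuses these rules,
# so numbers agree across the organization.
METRIC_DEFINITIONS = {
    "web_orders": lambda df: (df["gross"] - df["refund"]).sum(),
    "retail_orders": lambda df: df["amount_cents"].sum() / 100,
}

sources = {"web_orders": web_orders, "retail_orders": retail_orders}
revenue = sum(METRIC_DEFINITIONS[name](df) for name, df in sources.items())
print(f"company-wide revenue: {revenue:.2f}")  # 725.00
```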

In a blog post announcing the deal, Looker CEO Frank Bien sought to ease concerns that the company might move away from the multi-cloud, multi-database support. “For customers and partners, it’s important to know that today’s announcement solidifies ours as well as Google Cloud’s commitment to multi-cloud. Looker customers can expect continuing support of all cloud databases like Amazon Redshift, Azure SQL, Snowflake, Oracle, Microsoft SQL Server, Teradata and more,” Bien wrote in the post.

No antitrust concerns

Kurian also emphasized that this deal shouldn’t attract the attention of antitrust regulators, who have been sniffing around the big tech companies like Google/Alphabet, Apple and Amazon as of late. “We’re not buying any data along with this transaction. So it does not introduce any concentration risk in terms of concentrating data. Secondly, there are a large number of analytic tools in the market. So by just acquiring Looker, we’re not further concentrating the market in any sense. And lastly, all the other cloud players also have their own analytic tools. So it represents a further strengthening of our competitive position relative to the other players in the market,” he explained. Not to mention its pledge to uphold the multi-cloud and multi-database support, which should show it is not doing this strictly to benefit Google or to draw customers specifically to GCP.

Just this week, the company announced a partnership with Snowflake, the cloud data warehouse startup that has raised almost a billion dollars, to run on Google Cloud Platform. It already runs on AWS and Microsoft Azure. In fact, Wang suggested that Snowflake could be next on Google’s radar as it tries to build a multi-cloud, soup-to-nuts analytics offering.

Regardless, with Looker the company has a data analytics tool to complement its data processing tools, and together the two companies should provide a fairly comprehensive data solution. If they truly keep it multi-cloud, that should keep current customers happy, especially those who work with tools outside of the Google Cloud ecosystem or simply want to maintain their flexibility.

Jun
06
2019
--

Google to acquire analytics startup Looker for $2.6 billion

Google made a big splash this morning when it announced it’s going to acquire Looker, a hot analytics startup that’s raised more than $280 million. It’s paying $2.6 billion for the privilege and adding the company to Google Cloud.

Thomas Kurian, the man who was handed the reins to Google Cloud at the end of last year, sees the two companies bringing together a complete data analytics solution for customers. “The combination provides an end-to-end analytics platform to connect, collect, analyze and visualize data across Google Cloud, Azure, AWS, on-premises databases and ISV applications,” Kurian explained at a media event this morning.

Google Cloud has been mired in third place in the cloud infrastructure market, and grabbing Looker gives it an analytics company with a solid track record. The last time I spoke to Looker, it was announcing a hefty $103 million in funding on a $1.6 billion valuation. Today’s price is a nice even billion over that.

As I wrote at the time, Looker’s CEO Frank Bien wasn’t all that interested in bragging about valuations; he wanted to talk about what he considered more important numbers:

He reported that the company has 1,600 customers now and just crossed the $100 million revenue run rate, a significant milestone for any enterprise SaaS company. What’s more, Bien reports revenue is still growing 70 percent year over year, so there’s plenty of room to keep this going.

Today, in a media briefing on the deal, he said that from the start, his company was really trying to disrupt the business intelligence and analytics market. “What we wanted to do was disrupt this pretty staid ecosystem of data visualization tools and data prep tools that companies were being forced to build solutions. We thought it was time to rationalize a new platform for data, a single place where we could really reconstitute a single view of information and make it available in the enterprise for business purposes,” he said.

Slide: Google & Looker

Bien saw today’s deal as a chance to gain the scale of the Google cloud platform, and as successful as the company has been, it’s never going to have the reach of Google Cloud. “What we’re really leveraging here, and I think the synergy with Google Cloud, is that this data infrastructure revolution and what really emerged out of the Big Data trend was very fast, scalable — and now in the cloud — easy to deploy data infrastructure,” he said.


Kurian also emphasized that the company intends to support multiple databases and multiple deployment strategies, whether multi-cloud, hybrid or on premises.

Perhaps it’s not a coincidence that Google went after Looker, as the two companies had a strong existing partnership and 350 common customers, according to Google. “We have many common customers we’ve worked with. One of the great things about this acquisition is that the two companies have known each other for a long time, we share very common culture,” Kurian said.

This is a huge deal for Google Cloud, easily topping the $625 million it paid for Apigee in 2016. It marks the first major deal in the Kurian era as Google tries to beef up its market share. While the two companies share common customers, the addition of Looker should bring a net gain that could help them upsell to other parts of the Looker customer base.

Per usual, this deal is going to be subject to regulatory approval, but it is expected to close later this year if all goes well.

May
08
2019
--

Sumo Logic announces $110M Series G investment on valuation over $1B

Sumo Logic, a cloud data analytics and log analysis company, announced a $110 million Series G investment today. The company indicated that its valuation was “north of a billion dollars,” but wouldn’t give an exact figure.

Today’s round was led by Battery Ventures with participation from new investors Tiger Global Management and Franklin Templeton. Other unnamed existing investors also participated, according to the company. Today’s investment brings the total raised to $345 million.

When we spoke to Sumo Logic CEO Ramin Sayar at the time of its $75 million Series F in 2017, he indicated the company was on its way to becoming a public company. While that hasn’t happened yet, he says it is still the goal for the company, and investors wanted in on that before it happened.

“We don’t need to raise capital. We had plenty of capital already, but when you bring on crossover investors and others in this stage of a company, they have minimum check sizes and they have a lot of appetite to help you as you get ready to address a lot of the challenges and opportunities as you become a public company,” he said.

He says the company will be investing the money in continuing to develop the platform, whether that’s through acquisitions, which of course the money would help with, or through the company’s own engineering efforts.

The IPO idea remains a goal, but Sayar was not willing or able to commit to when that might happen. The company clearly has plenty of runway now to last for quite some time.

“We could go out now if we wanted to, but we made a decision that that’s not what we’re going to do, and we’re going to continue to double down and invest, and therefore bring some more capital in to give us more optionality for strategic tuck-ins and product IP expansion, international expansion — and then look to the public markets [after] we do that,” he said.

Dharmesh Thakker, general partner at investor Battery Ventures, says his firm likes Sumo Logic’s approach and sees a big opportunity ahead with this investment. “We have been tracking the Sumo Logic team for some time, and admire the company’s early understanding of the massive cloud-native opportunity and the rise of new, modern application architectures,” he said in a statement.

The company crossed the $100 million revenue mark last year and has 2,000 customers, including Airbnb, Anheuser-Busch and Samsung. It competes with companies like Splunk, Scalyr and Loggly.

Feb
20
2019
--

Why Daimler moved its big data platform to the cloud

Like virtually every big enterprise, the German auto giant Daimler decided a few years ago to invest in its own on-premises data centers. And while those aren’t going away anytime soon, the company today announced that it has successfully moved its on-premises big data platform to Microsoft’s Azure cloud. This new platform, which the company calls eXtollo, is Daimler’s first major service to run outside of its own data centers, though it’ll probably not be the last.

As Guido Vetter, the head of Daimler’s corporate center of excellence for advanced analytics and big data, told me, the company started getting interested in big data about five years ago. “We invested in technology — the classical way, on-premise — and got a couple of people on it. And we were investigating what we could do with data because data is transforming our whole business as well,” he said.

By 2016, the size of the organization had grown to the point where a more formal structure was needed to enable the company to handle its data at a global scale. At the time, the buzz phrase was “data lakes” and the company started building its own in order to build out its analytics capacities.

Electric lineup, Daimler AG

“Sooner or later, we hit the limits as it’s not our core business to run these big environments,” Vetter said. “Flexibility and scalability are what you need for AI and advanced analytics and our whole operations are not set up for that. Our backend operations are set up for keeping a plant running and keeping everything safe and secure.” But in this new world of enterprise IT, companies need to be able to be flexible and experiment — and, if necessary, throw out failed experiments quickly.

So about a year and a half ago, Vetter’s team started the eXtollo project to bring all the company’s activities around advanced analytics, big data and artificial intelligence into the Azure Cloud, and just over two weeks ago, the team shut down its last on-premises servers after slowly turning on its solutions in Microsoft’s data centers in Europe, the U.S. and Asia. All in all, the actual transition between the on-premises data centers and the Azure cloud took about nine months. That may not seem fast, but for an enterprise project like this, that’s about as fast as it gets (and for a while, it fed all new data into both its on-premises data lake and Azure).

If you work for a startup, then all of this probably doesn’t seem like a big deal, but for a more traditional enterprise like Daimler, even just giving up control over the physical hardware where your data resides was a major culture change and something that took quite a bit of convincing. In the end, the solution came down to encryption.

“We needed the means to secure the data in the Microsoft data center with our own means that ensure that only we have access to the raw data and work with the data,” explained Vetter. In the end, the company decided to use the Azure Key Vault to manage and rotate its encryption keys. Indeed, Vetter noted that knowing that the company had full control over its own data was what allowed this project to move forward.
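
For readers curious what this looks like in practice, here is a minimal sketch of customer-managed key handling using Microsoft’s azure-identity and azure-keyvault-keys Python packages. The vault URL and key name are placeholders, and this illustrates the general pattern, not eXtollo’s actual code:

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

# Authenticate and point at a (placeholder) Key Vault instance.
credential = DefaultAzureCredential()
client = KeyClient(
    vault_url="https://example-vault.vault.azure.net", credential=credential
)

# Create a customer-controlled RSA key; without it, the raw data in the
# cloud data lake stays unreadable to anyone but the key's owner.
key = client.create_rsa_key("extollo-data-key", size=2048)
print(key.name, key.key_type)

# Rotate the key so that old key material is regularly retired.
# (rotate_key requires a rotation policy and a recent SDK version.)
rotated = client.rotate_key("extollo-data-key")
print(rotated.properties.version)
```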

Vetter tells me the company obviously looked at Microsoft’s competitors as well, but he noted that his team didn’t find a compelling offer from other vendors in terms of functionality and the security features that it needed.

Today, Daimler’s big data unit uses tools like HDInsight and Azure Databricks, which cover more than 90 percent of the company’s current use cases. In the future, Vetter also wants to make it easier for less experienced users to use self-service tools to launch AI and analytics services.

While cost is often a factor that counts against the cloud, because renting server capacity isn’t cheap, Vetter argues that this move will actually save the company money and that storage costs, especially, are going to be cheaper in the cloud than in its on-premises data center (and chances are that Daimler, given its size and prestige as a customer, isn’t exactly paying the same rack rate that others are paying for the Azure services).

As with so many big data AI projects, predictions are the focus of much of what Daimler is doing. That may mean looking at a car’s data and error code and helping the technician diagnose an issue or doing predictive maintenance on a commercial vehicle. Interestingly, the company isn’t currently bringing to the cloud any of its own IoT data from its plants. That’s all managed in the company’s on-premises data centers because it wants to avoid the risk of having to shut down a plant because its tools lost the connection to a data center, for example.

Feb
07
2019
--

Microsoft Azure sets its sights on more analytics workloads

Enterprises now amass huge amounts of data, both from their own tools and applications, as well as from the SaaS applications they use. For a long time, that data was basically exhaust. Maybe it was stored for a while to fulfill some legal requirements, but then it was discarded. Now, data is what drives machine learning models, and the more data you have, the better. It’s maybe no surprise, then, that the big cloud vendors started investing in data warehouses and lakes early on. But that’s just a first step. After that, you also need the analytics tools to make all of this data useful.

Today, it’s Microsoft’s turn to shine the spotlight on its data analytics services. The actual news here is pretty straightforward. Two services are moving into general availability: the second generation of Azure Data Lake Storage, for big data analytics workloads, and Azure Data Explorer, a managed service that makes ad-hoc analysis of massive data volumes easier. Microsoft is also previewing a new feature in Azure Data Factory, its graphical no-code service for building data transformations: Data Factory can now map data flows.
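
To give a flavor of the ad-hoc analysis Azure Data Explorer is built for, here is a small sketch using the azure-kusto-data Python package. The cluster, database and table names are placeholders:

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Connect to a (placeholder) Azure Data Explorer cluster.
kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(
    "https://mycluster.westus.kusto.windows.net"
)
client = KustoClient(kcsb)

# KQL: count events per hour over the last day from a hypothetical table.
query = """
Events
| where Timestamp > ago(1d)
| summarize count() by bin(Timestamp, 1h)
| order by Timestamp asc
"""
response = client.execute("mydatabase", query)
for row in response.primary_results[0]:
    print(row)
```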

Those individual news pieces are interesting if you are a user or are considering Azure for your big data workloads, but what’s maybe more important here is that Microsoft is trying to offer a comprehensive set of tools for managing and storing this data — and then using it for building analytics and AI services.

(Photo credit: Josh Edelson/AFP/Getty Images)

“AI is a top priority for every company around the globe,” Julia White, Microsoft’s corporate VP for Azure, told me. “And as we are working with our customers on AI, it becomes clear that their analytics often aren’t good enough for building an AI platform.” These companies are generating plenty of data, which then has to be pulled into analytics systems. She stressed that she couldn’t remember a customer conversation in recent months that didn’t focus on AI. “There is urgency to get to the AI dream,” White said, but the growth and variety of data presents a major challenge for many enterprises. “They thought this was a technology that was separate from their core systems. Now it’s expected for both customer-facing and line-of-business applications.”

Data Lake Storage helps with managing this variety of data since it can handle both structured and unstructured data (and is optimized for the Spark and Hadoop analytics engines). The service can ingest any kind of data — yet Microsoft still promises that it will be very fast. “The world of analytics tended to be defined by having to decide upfront and then building rigid structures around it to get the performance you wanted,” explained White. Data Lake Storage, on the other hand, wants to offer the best of both worlds.
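
In practice, that optimization means a single Spark job can read both kinds of data from the same store. A minimal PySpark sketch, with placeholder account and container names and simple key-based auth for brevity:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-gen2-demo").getOrCreate()

# Key-based auth shown for brevity; production setups typically use OAuth.
spark.conf.set(
    "fs.azure.account.key.myaccount.dfs.core.windows.net", "<storage-account-key>"
)

# Structured data (Parquet) and unstructured data (raw text), one store.
events = spark.read.parquet("abfss://analytics@myaccount.dfs.core.windows.net/events/")
logs = spark.read.text("abfss://analytics@myaccount.dfs.core.windows.net/raw-logs/")

print(events.count(), logs.count())
```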

Likewise, White argued that many enterprises still keep these services on their own on-premises servers, often running them on specialized appliances. But she believes the cloud has now reached the point where the price/performance calculations are in its favor. It took a while to get to this point, though, and to convince enterprises. White noted that for the longest time, enterprises thought of analytics as $300 million projects that took forever, tied up lots of people and were frankly a bit scary. “But also, what we had to offer in the cloud hasn’t been amazing until some of the recent work,” she said. “We’ve been on a journey — as well as the other cloud vendors — and the price performance is now compelling.” And it sure helps that if enterprises want to meet their AI goals, they’ll now have to tackle these workloads, too.

Sep
12
2018
--

Sisense hauls in $80M investment as data analytics business matures

Sisense, a company that helps customers understand and visualize their data across multiple sources, announced an $80 million Series E investment today led by Insight Venture Partners. They also announced that Zack Urlocker, former COO at Duo Security and Zendesk, has joined the organization’s board of directors.

The company has attracted a prestigious list of past investors, who also participated in the round, including Battery Ventures, Bessemer Venture Partners, DFJ Venture Capital, Genesis Partners and Opus Capital. Today’s investment brings the total raised to close to $200 million.

CEO Amir Orad says investors like their mission of simplifying complex data with analytics and business intelligence and delivering it in whatever way makes sense. That could be on screens throughout the company, desktop or smartphone, or via Amazon Alexa. “We found a way to make accessing data extremely simple, mashing it together in a logical way and embedding it in every logical place,” he explained.

It appears to be resonating. The company has over 1,000 customers, including Expedia, Oppenheimer and Phillips, to name but a few. Orad says Sisense is actually the analytics engine behind Nasdaq Corporate Solutions, which is the main investor relations system used by CFOs.

He was not in the mood to discuss the company’s valuation, an exercise he called “an ego boost he doesn’t relate to.” He says that he would prefer to be measured by how efficiently he uses the money investors give him or by customer satisfaction scores. Nor would he deal with IPO speculation. All he would say on that front was, “When you focus on the value you bring, positive things happen.”

In spite of that, he was clearly excited about having Urlocker join the board. He says the two spent six months getting to know each other, and he sees someone joining his team who has brought several companies to successful exits, and perhaps someone who can help him bring his company across the finish line, however that ultimately happens. Just last month, Cisco bought Urlocker’s former company, Duo Security, for $2.35 billion.

For now, Sisense, which launched in 2010, has another $80 million in the bank. They plan to add to the nearly 500 employees already in place in offices in New York, Tel Aviv, Kiev, Tokyo and Arizona. In particular, they plan to grow their international presence more aggressively, especially adding employees to help with customer success and field engineering. Orad also said that he was open to acquiring companies should the right opportunity come along, saying, “Because of talent, technology and presence, it’s something you have to be on lookout for.”

When a company reaches Series E and a couple of hundred million raised, it’s often a point where an exit could be coming sooner than later. By adding an experienced executive like Urlocker, it just emphasizes that possibility, but for now the company appears to be growing and thriving, and taking the view that whatever will be, will be.

Sep
11
2018
--

Twilio’s contact center products just got more analytical with Ytica acquisition

Twilio, a company best known for supplying communications APIs for developers, has a product called Twilio Flex for building sophisticated customer service applications on top of those APIs. Today, it announced it was acquiring Ytica (pronounced “Why-tica”) to provide an operational and analytical layer on top of the customer service solution.

The companies would not discuss the purchase price, but Twilio indicated it does not expect the acquisition to have a material impact on its “results, operations or financial condition.” In other words, it probably didn’t cost much.

Ytica, which is based in Prague, has actually been a partner with Twilio for some time, so coming together in this fashion really made a lot of sense, especially as Twilio has been developing Flex.

Twilio Flex is an app platform for contact centers that offers a full stack of applications and allows users to deliver customer support over multiple channels, Al Cook, general manager of Twilio Flex, explained. “Flex deploys like SaaS, but because it’s built on top of APIs, you can reach in and change how Flex works,” he said. That is very appealing, especially for larger operations looking for a flexible, cloud-based solution without the baggage of on-prem legacy products.
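
To illustrate what “reach in and change how Flex works” can mean in practice: Flex’s routing layer is built on Twilio’s TaskRouter API, which can be scripted with the twilio Python helper library. A hedged sketch, with placeholder account and resource SIDs:

```python
from twilio.rest import Client

# Placeholder credentials and SIDs; real values come from the Twilio console.
client = Client("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX", "your_auth_token")
workspace = client.taskrouter.v1.workspaces("WSXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX")

# List task queues, e.g. to feed a custom reporting dashboard.
for queue in workspace.task_queues.list(limit=20):
    print(queue.friendly_name, queue.sid)

# Inspect an individual agent's current state, e.g. before a routing change.
worker = workspace.workers("WKXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX").fetch()
print(worker.friendly_name, worker.activity_name)
```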

What the product was lacking, however, was a native way to manage customer service representatives from within the application and to understand, through analytics and dashboards, how well or poorly the team was doing. The ability to measure the team’s effectiveness becomes even more critical as the group grows, and Cook indicated some Flex users are managing enormous groups of 10,000-20,000 employees.

Ytica provides a way to measure the performance of customer service staff, allowing management to monitor and intervene and coach when necessary. “It made so much sense to join together as one team. They have huge experience in the contact center, and a similar philosophy to build something customizable and programmable in the cloud,” Cook said.

While Ytica works with other vendors beyond Twilio, CEO Simon Vostrý says that they will continue to support those customers, even as they join the Twilio family. “We can run Flex and can continue to run this separately. We have customers running on other SaaS platforms, and we will continue to support them,” he said.

The company will remain in Prague and become a Twilio satellite office. All 14 employees are expected to join the Twilio team and Cook says plans are already in the works to expand the Prague team.

Jul
24
2018
--

Outlier raises $6.2M Series A to change how companies use data

Traditionally, companies have gathered data from a variety of sources, then used spreadsheets and dashboards to try to make sense of it all. Outlier wants to change that by delivering, right to your inbox, the handful of insights that matter most for your job, company and industry. Today the company announced a $6.2 million Series A to further develop that vision.

The round was led by Ridge Ventures with assistance from 11.2 Capital, First Round Capital, Homebrew, Susa Ventures and SV Angel. The company has raised over $8 million.

The startup is trying to solve a difficult problem: delivering meaningful insight without requiring the customer to ask the right questions. With traditional BI tools, you get your data, start asking questions and see if the data can give you some answers. Outlier wants to add a level of intelligence and automation by surfacing insights without anyone having to explicitly ask the right question.

Company founder and CEO Sean Byrnes says his previous company, Flurry, helped deliver mobile analytics to customers, but in his travels meeting customers in that previous iteration, he always came up against the same question: “This is great, but what should I look for in all that data?”

The question was so compelling that it stuck in the back of his mind even after he sold Flurry to Yahoo in 2014 for more than $200 million, and he decided to start a business to solve it. He contends that the first 15 years of BI were about getting answers to basic questions about company performance, but the next 15 will be about finding a way to get the software to ask good questions based on the huge amounts of data.

Byrnes admits that when he launched, he didn’t have much sense of how to put this notion into action, and most people he approached didn’t think it was a great idea. He says he heard “No” from a fair number of investors early on because the artificial intelligence required to fuel a solution like this really wasn’t ready in 2015 when he started the company.

He says that it took four or five iterations to get to today’s product, which lets you connect to various data sources and, using artificial intelligence and machine learning, delivers to the user’s email inbox a list of four or five relevant insights that point out things you might not have noticed, what he calls “shifts below the surface.” If you’re a retailer, that could be changing market conditions that signal you might want to adjust your production goals.
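
As a generic illustration of this class of technique (not Outlier’s actual method), a rolling-baseline check can flag a “shift below the surface” that nobody thought to query for:

```python
import pandas as pd

# Ten days of a hypothetical metric, with one day that quietly jumps.
daily_sales = pd.Series(
    [100, 103, 98, 101, 99, 102, 97, 140, 101, 100],
    index=pd.date_range("2018-07-01", periods=10),
)

# Compare each day against the mean/std of the five days before it.
window = 5
baseline_mean = daily_sales.shift(1).rolling(window).mean()
baseline_std = daily_sales.shift(1).rolling(window).std()
z_scores = (daily_sales - baseline_mean) / baseline_std

# Anything beyond 3 standard deviations becomes a candidate insight to email.
shifts = daily_sales[z_scores.abs() > 3]
print(shifts)  # flags 2018-07-08, the day sales hit 140
```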

Outlier email example. Photo: Outlier

The company launched in 2015. It took some time to polish the product, but today they have 14 employees and 14 customers including Jack Rogers, Celebrity Cruises and Swarovski.

This round should allow them to continue working to grow the company. “We feel like we hit the right product-market fit because we have customers [generating] reproducible results and really changing the way people use the data,” he said.

Jul
18
2018
--

Swim.ai raises $10M to bring real-time analytics to the edge

Once upon a time, it looked like cloud-based services would become the central hub for analyzing all IoT data. But it didn’t quite turn out that way, because most IoT solutions simply generate too much data to do this effectively, and the round trip to the data center doesn’t work for applications that have to react in real time. Hence the advent of edge computing, which is spawning its own ecosystem of startups.

Among those is Swim.ai, which today announced that it has raised a $10 million Series B funding round led by Cambridge Innovation Capital, with participation from Silver Creek Ventures and Harris Barton Asset Management. The round also included a strategic investment from Arm, the chip design firm you may still remember as ARM (but don’t write it like that or their PR department will promptly email you). This brings the company’s total funding to about $18 million.

Swim.ai has an interesting take on edge computing. The company’s SWIM EDX product combines local data processing and analytics with local machine learning. In the traditional approach, edge devices collect the data, maybe perform some basic operations against it to bring down the bandwidth cost, and then ship it to the cloud, where the hard work is done and where, if you are doing machine learning, the models are trained. Swim.ai argues that this doesn’t work for applications that need to respond in real time. Swim.ai instead performs the model training on the edge device itself, pulling in data from all connected devices. It then builds a digital twin for each of these devices and uses that twin to self-train its models on the device’s own data.
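
Here is a conceptual sketch of that pattern, not the SWIM EDX API: one digital twin per device, keeping local state and nudging a tiny model with each new reading, so learning happens without a round trip to the data center:

```python
class DigitalTwin:
    """A toy twin: predicts a device's next reading from its current one."""

    def __init__(self, device_id, lr=0.05):
        self.device_id = device_id
        self.lr = lr
        self.weight = 0.0   # one-parameter model: next ≈ weight * current
        self.last_value = None

    def observe(self, value):
        """Ingest a reading; return the prediction error, if one was made."""
        error = None
        if self.last_value is not None:
            prediction = self.weight * self.last_value
            error = value - prediction
            # Online gradient step: the twin learns on-device, per reading.
            self.weight += self.lr * error * self.last_value
        self.last_value = value
        return error

# One twin per connected device, trained only on that device's own stream.
twins = {dev: DigitalTwin(dev) for dev in ["sensor-a", "sensor-b"]}
for value in [1.0, 1.1, 1.2, 1.3]:
    err = twins["sensor-a"].observe(value)
    print(f"sensor-a prediction error: {err}")
```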

“Demand for the EDX software is rapidly increasing, driven by our software’s unique ability to analyze and reduce data, share new insights instantly peer-to-peer – locally at the ‘edge’ on existing equipment. Efficiently processing edge data and enabling insights to be easily created and delivered with the lowest latency are critical needs for any organization,” said Rusty Cumpston, co-founder and CEO of Swim.ai. “We are thrilled to partner with our new and existing investors who share our vision and look forward to shaping the future of real-time analytics at the edge.”

The company doesn’t disclose any current customers, but it is focusing its efforts on manufacturers, service providers and smart city solutions. Update: Swim.ai did tell us about two customers after we published this story: The City of Palo Alto and Itron.

Swim.ai plans to use its new funding to launch a new R&D center in Cambridge, UK, expand its product development team and tackle new verticals and geographies with an expanded sales and marketing team.
