Apr 16, 2021

Data scientists: Bring the narrative to the forefront

By 2025, 463 exabytes of data will be created each day, according to some estimates. (For perspective, one exabyte of storage could hold 50,000 years of DVD-quality video.) It’s now easier than ever to translate physical and digital actions into data, and businesses of all types have raced to amass as much data as possible in order to gain a competitive edge.
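
That 50,000-years figure is easy to sanity-check with quick arithmetic. Here is a minimal sketch, assuming a decimal exabyte (10^18 bytes) and a typical DVD-quality video bitrate of roughly 5 Mbit/s (both assumptions, not figures from the estimate itself):

```python
# Back-of-envelope check: how many years of DVD-quality video fit in 1 exabyte?
EXABYTE_BITS = 1e18 * 8            # 1 EB in bits (decimal definition, assumed)
DVD_BITRATE_BPS = 5e6              # ~5 Mbit/s, a rough DVD-quality stream
SECONDS_PER_YEAR = 365.25 * 24 * 3600

years = EXABYTE_BITS / DVD_BITRATE_BPS / SECONDS_PER_YEAR
print(f"{years:,.0f} years of video")   # ~50,700 -- in line with the claim
```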

However, in our collective infatuation with data (and obtaining more of it), what’s often overlooked is the role that storytelling plays in extracting real value from data.

The reality is that data by itself is insufficient to really influence human behavior. Whether the goal is to improve a business’ bottom line or convince people to stay home amid a pandemic, it’s the narrative that compels action, rather than the numbers alone. As more data is collected and analyzed, communication and storytelling will become even more integral in the data science discipline because of their role in separating the signal from the noise.

Data alone doesn’t spur innovation — rather, it’s data-driven storytelling that helps uncover hidden trends, powers personalization, and streamlines processes.

Yet this can be an area where data scientists struggle. In Anaconda’s 2020 State of Data Science survey of more than 2,300 data scientists, nearly a quarter of respondents said that their data science or machine learning (ML) teams lacked communication skills. This may be one reason why roughly 40% of respondents said they were able to effectively demonstrate business impact “only sometimes” or “almost never.”

The best data practitioners must be as skilled in storytelling as they are in coding and deploying models — and yes, this extends beyond creating visualizations to accompany reports. Here are some recommendations for how data scientists can situate their results within larger contextual narratives.

Make the abstract more tangible

Ever-growing datasets help machine learning models better understand the scope of a problem space, but more data does not necessarily help with human comprehension. Even for the most left-brain of thinkers, it’s not in our nature to understand large abstract numbers or things like marginal improvements in accuracy. This is why it’s important to include points of reference in your storytelling that make data tangible.

For example, throughout the pandemic, we’ve been bombarded with countless statistics around case counts, death rates, positivity rates, and more. While all of this data is important, tools like interactive maps and conversations around reproduction numbers are more effective than massive data dumps in terms of providing context, conveying risk, and, consequently, helping change behaviors as needed. In working with numbers, data practitioners have a responsibility to provide the necessary structure so that the data can be understood by the intended audience.
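
A concrete illustration of that structure: reframing a raw count as a per-capita figure an audience can picture. The numbers below are hypothetical, purely for illustration:

```python
# Hypothetical figures -- not real epidemiological data.
daily_cases = 3_250
population = 1_300_000

rate = daily_cases / population
print(f"Raw count: {daily_cases:,} new cases today")
# The same fact, anchored to a reference point people can picture:
print(f"Reframed: about 1 in {round(1 / rate):,} residents tested positive today")
```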

Apr 14, 2021

PlexTrac raises $10M Series A round for its collaboration-centric security platform

PlexTrac, a Boise, ID-based security service that aims to provide a unified workflow automation platform for red and blue teams, today announced that it has raised a $10 million Series A funding round led by Noro-Moseley Partners and Madrona Venture Group. StageDot0 Ventures also participated in this round, which the company plans to use to build out its team and grow its platform.

With this new round, the company, which was founded in 2018, has now raised a total of $11 million, with StageDot0 leading its 2019 seed round.

PlexTrac CEO and President Dan DeCloss. Image Credits: PlexTrac

“I have been on both sides of the fence, the specialist who comes in and does the assessment, produces that 300-page report and then comes back a year later to find that some of the critical issues had not been addressed at all. And not because the organization didn’t want to but because it was lost in that report,” PlexTrac CEO and President Dan DeCloss said. “These are some of the most critical findings for an entity from a risk perspective. By making it collaborative, both red and blue teams are united on the same goal we all share, to protect the network and assets.”

With an extensive career in security that included time as a penetration tester for Veracode and the Mayo Clinic, as well as senior information security advisor for Anthem, among other roles, DeCloss has quite a bit of firsthand experience that led him to found PlexTrac. Specifically, he believes that it’s important to break down the wall between offense-focused red teams and defense-centric blue teams.

Image Credits: PlexTrac

“Historically there has been more of the cloak and dagger relationship but those walls are breaking down — and rightfully so, there isn’t that much of that mentality today — people recognize they are on the same mission whether they are an internal security team or an external team,” he said. “With the PlexTrac platform the red and blue teams have a better view into the other teams’ tactics and techniques — and it makes the whole process into an educational exercise for everyone.”

At its core, PlexTrac makes it easier for security teams to produce their reports — and hence frees them up to actually focus on “real” security work. To do so, the service integrates with most of the popular scanners like Qualys and Veracode, but also tools like ServiceNow and Jira in order to help teams coordinate their workflows. All the data flows into real-time reports that then help teams monitor their security posture. The service also features a dedicated tool, WriteupsDB, for managing reusable write-ups to help teams deliver consistent reports for a variety of audiences.
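
PlexTrac’s own APIs aren’t detailed in this piece, but the underlying pattern, normalizing findings from several scanners into a single prioritized report, is straightforward to sketch. The scanner names below are real; the field names and severity scale are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    source: str     # e.g. "Qualys" or "Veracode"
    title: str
    severity: int   # assumed scale: 1 (low) to 5 (critical)

def merge_findings(*feeds: list) -> list:
    """Flatten per-scanner feeds into one report, most critical first."""
    merged = [f for feed in feeds for f in feed]
    return sorted(merged, key=lambda f: -f.severity)

qualys = [Finding("Qualys", "TLS 1.0 enabled on load balancer", 3)]
veracode = [Finding("Veracode", "SQL injection in /login", 5)]

for f in merge_findings(qualys, veracode):
    print(f"[{f.severity}] {f.source}: {f.title}")
```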

“Current tools for planning, executing and reporting on security testing workflows are either nonexistent (manual reporting, spreadsheets, documents, etc. …) or exist as largely incomplete features of legacy platforms,” Madrona’s S. Somasegar and Chris Picardo write in today’s announcement. “The pain point for security teams is real and PlexTrac is able to streamline their workflows, save time, and greatly improve output quality. These teams are on the leading edge of attempting to find and exploit vulnerabilities (red teams) and defend and/or eliminate threats (blue teams).”


Apr 13, 2021

Meroxa raises $15M Series A for its real-time data platform

Meroxa, a startup that makes it easier for businesses to build the data pipelines to power both their analytics and operational workflows, today announced that it has raised a $15 million Series A funding round led by Drive Capital. Existing investors Root, Amplify and Hustle Fund also participated in this round, which together with the company’s previously undisclosed $4.2 million seed round now brings total funding in the company to $19.2 million.

The promise of Meroxa is that businesses can use a single platform for their various data needs and won’t need a team of experts to build their infrastructure and then manage it. At its core, Meroxa provides a single software-as-a-service solution that connects relational databases to data warehouses and then helps businesses operationalize that data.

Image Credits: Meroxa

“The interesting thing is that we are focusing squarely on relational and NoSQL databases into data warehouse,” Meroxa co-founder and CEO DeVaris Brown told me. “Honestly, people come to us as a real-time FiveTran or real-time data warehouse sink. Because, you know, the industry has moved to this [extract, load, transform] format. But the beautiful part about us is, because we do change data capture, we get that granular data as it happens.” And businesses want this very granular data to be reflected inside of their data warehouses, Brown noted, but he also stressed that Meroxa can expose this stream of data as an API endpoint or point it to a Webhook.

The company is able to do this because its core architecture is somewhat different from other data pipeline and integration services that, at first glance, seem to offer a similar solution. Because of this, users can use the service to connect different tools to their data warehouse but also build real-time tools on top of these data streams.

Image Credits: Meroxa

“We aren’t a point-to-point solution,” Meroxa co-founder and CTO Ali Hamidi explained. “When you set up the connection, you aren’t taking data from Postgres and only putting it into Snowflake. What’s really happening is that it’s going into our intermediate stream. Once it’s in that stream, you can then start hanging off connectors and say, ‘Okay, well, I also want to peek into the stream, I want to transfer my data, I want to filter out some things, I want to put it into S3.’ ”

With this flexibility, Hamidi noted, a lot of the company’s customers start with a pretty standard use case and then quickly expand into other areas as well.
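
Meroxa’s actual SDK isn’t shown in the article, so the sketch below is only a generic illustration of the pattern Hamidi describes: change events land in an intermediate stream, and any number of consumers (a warehouse writer, a filtered webhook, an S3 sink) hang off that stream independently. All names and the event shape are assumptions:

```python
import json
from typing import Callable

Event = dict                          # one change-data-capture record
Connector = Callable[[Event], None]

class Stream:
    """Toy intermediate stream: fans each change event out to every connector."""
    def __init__(self) -> None:
        self.connectors: list = []

    def attach(self, connector: Connector) -> None:
        self.connectors.append(connector)

    def publish(self, event: Event) -> None:
        for connector in self.connectors:
            connector(event)

def warehouse_sink(e: Event) -> None:
    print("warehouse <-", json.dumps(e))    # stands in for a Snowflake writer

def delete_webhook(e: Event) -> None:
    if e["op"] == "delete":                 # a filtered consumer on the same stream
        print("webhook <-", json.dumps(e))

stream = Stream()
stream.attach(warehouse_sink)
stream.attach(delete_webhook)

# The shape of a CDC event from Postgres is assumed here for illustration.
stream.publish({"op": "insert", "table": "orders", "row": {"id": 42}})
```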

Brown and Hamidi met during their time at Heroku, where Brown was a director of product management and Hamidi a lead software engineer. But while Heroku made it very easy for developers to publish their web apps, there wasn’t anything comparable in the highly fragmented database space. The team acknowledges that there are a lot of tools that aim to solve these data problems, but few of them focus on the user experience.

Image Credits: Meroxa

“When we talk to customers now, it’s still very much an unsolved problem,” Hamidi said. “It seems kind of insane to me that this is such a common thing and there is no ‘oh, of course you use this tool because it addresses all my problems.’ And so the angle that we’re taking is that we see user experience not as a nice-to-have, it’s really an enabler, it is something that enables a software engineer or someone who isn’t a data engineer with 10 years of experience in wrangling Kafka and Postgres and all these things. […] That’s a transformative kind of change.”

It’s worth noting that Meroxa uses a lot of open-source tools but the company has also committed to open-sourcing everything in its data plane as well. “This has multiple wins for us, but one of the biggest incentives is in terms of the customer, we’re really committed to having our agenda aligned. Because if we don’t do well, we don’t serve the customer. If we do a crappy job, they can just keep all of those components and run it themselves,” Hamidi explained.

Today, Meroxa, which the team founded in early 2020, has more than 24 employees (and is 100% remote). “I really think we’re building one of the most talented and most inclusive teams possible,” Brown told me. “Inclusion and diversity are very, very high on our radar. Our team is 50% black and brown. Over 40% are women. Our management team is 90% underrepresented. So not only are we building a great product, we’re building a great company, we’re building a great business.”  

Apr 8, 2021

Immersion cooling to offset data centers’ massive power demands gains a big booster in Microsoft

LiquidStack does it. So does Submer. They’re both dropping servers carrying sensitive data into goop in an effort to save the planet. Now they’re joined by one of the biggest tech companies in the world in their efforts to improve the energy efficiency of data centers, because Microsoft is getting into the liquid-immersion cooling market.

Microsoft is using a liquid it developed in-house that’s engineered to boil at 122 degrees Fahrenheit (lower than the boiling point of water) to act as a heat sink, reducing the temperature inside the servers so they can operate at full power without any risks from overheating.

The vapor from the boiling fluid is converted back into a liquid through contact with a cooled condenser in the lid of the tank that stores the servers.
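
The physics here lends itself to a quick back-of-envelope estimate. Microsoft hasn’t published its fluid’s properties, so the latent heat below is an assumption in the range of commercial engineered coolants (on the order of 100 kJ/kg):

```python
# Rough estimate: how much fluid must boil off each second to carry away a chip's heat?
chip_power_w = 700            # a high-end GPU, per the figures cited below
latent_heat_j_per_kg = 1e5    # ~100 kJ/kg, an assumed value for an engineered fluid

boil_off_kg_per_s = chip_power_w / latent_heat_j_per_kg
print(f"~{boil_off_kg_per_s * 1000:.0f} g of fluid boils off per second")   # ~7 g/s
```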

“We are the first cloud provider that is running two-phase immersion cooling in a production environment,” said Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development in Redmond, Washington, in a statement on the company’s internal blog. 

While that claim may be true, liquid cooling is a well-known approach to moving heat away from components to keep systems working. Cars use liquid cooling to keep their motors humming as they head out on the highway.

As technology companies confront the physical limits of Moore’s Law, the demand for faster, higher-performance processors means designing new architectures that can handle more power, the company wrote in a blog post. Power flowing through central processing units has increased from 150 watts to more than 300 watts per chip, and the GPUs responsible for much of Bitcoin mining, artificial intelligence applications and high-end graphics each consume more than 700 watts per chip.

It’s worth noting that Microsoft isn’t the first tech company to apply liquid cooling to data centers and the distinction that the company uses of being the first “cloud provider” is doing a lot of work. That’s because bitcoin mining operations have been using the tech for years. Indeed, LiquidStack was spun out from a bitcoin miner to commercialize its liquid immersion cooling tech and bring it to the masses.

“Air cooling is not enough”

More power flowing through the processors means hotter chips, which means the need for better cooling or the chips will malfunction.

“Air cooling is not enough,” said Christian Belady, vice president of Microsoft’s datacenter advanced development group in Redmond, in an interview for the company’s internal blog. “That’s what’s driving us to immersion cooling, where we can directly boil off the surfaces of the chip.”

For Belady, the use of liquid cooling technology brings the density and compression of Moore’s Law up to the datacenter level.

The results, from an energy consumption perspective, are impressive. Microsoft investigated liquid immersion as a cooling solution for high-performance computing applications such as AI and found that two-phase immersion cooling reduced power consumption for any given server by anywhere from 5% to 15% (every little bit helps).

Meanwhile, companies like Submer claim their systems reduce energy consumption by 50% and water use by 99% while taking up 85% less space.

For cloud computing companies, the ability to keep these servers up and running even during spikes in demand, when they’d consume even more power, adds flexibility and ensures uptime even when servers are overtaxed, according to Microsoft.

“[We] know that with Teams when you get to 1 o’clock or 2 o’clock, there is a huge spike because people are joining meetings at the same time,” Marcus Fontoura, a vice president on Microsoft’s Azure team, said on the company’s internal blog. “Immersion cooling gives us more flexibility to deal with these burst-y workloads.”

At this point, data centers are a critical component of the internet infrastructure that much of the world relies on for… well… pretty much every tech-enabled service. That reliance, however, has come at a significant environmental cost.

“Data centers power human advancement. Their role as a core infrastructure has become more apparent than ever and emerging technologies such as AI and IoT will continue to drive computing needs. However, the environmental footprint of the industry is growing at an alarming rate,” Alexander Danielsson, an investment manager at Norrsken VC noted last year when discussing that firm’s investment in Submer.

Solutions under the sea

If submerging servers in experimental liquids offers one potential solution to the problem, then sinking them in the ocean is another way that companies are trying to cool data centers without expending too much power.

Microsoft has already been operating an undersea data center for the past two years. The company actually trotted out the tech as part of a push to aid in the search for a COVID-19 vaccine last year.

These pre-packed, shipping container-sized data centers can be spun up on demand and run deep under the ocean’s surface for sustainable, high-efficiency and powerful compute operations, the company said.

The liquid cooling project is most similar to Microsoft’s Project Natick, which is exploring the potential of underwater datacenters that are quick to deploy and can operate for years on the seabed, sealed inside submarine-like tubes, without any onsite maintenance by people.

In those data centers, nitrogen gas replaces the engineered fluid and the servers are cooled with fans and a heat exchanger that pumps seawater through a sealed tube.

Startups are also staking claims to cool data centers out on the ocean (the seaweed is always greener in somebody else’s lake).

Nautilus Data Technologies, for instance, has raised over $100 million (according to Crunchbase) to develop data centers dotting the surface of Davy Jones’ locker. The company is currently developing a data center project co-located with a sustainable energy project in a tributary near Stockton, Calif.

With the two-phase immersion cooling tech, Microsoft is hoping to bring the benefits of ocean-cooling tech onto the shore. “We brought the sea to the servers rather than put the datacenter under the sea,” Microsoft’s Alissa said in a company statement.

Ioannis Manousakis, a principal software engineer with Azure (left), and Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development (right), walk past a container at a Microsoft datacenter where computer servers in a two-phase immersion cooling tank are processing workloads. Photo by Gene Twedt for Microsoft.

Apr 6, 2021

Esri brings its flagship ArcGIS platform to Kubernetes

Esri, the geographic information system (GIS), mapping and spatial analytics company, is hosting its (virtual) developer summit today. Unsurprisingly, it is making a couple of major announcements at the event that range from a new design system and improved JavaScript APIs to support for running ArcGIS Enterprise in containers on Kubernetes.

The Kubernetes project was a major undertaking for the company, Esri Product Managers Trevor Seaton and Philip Heede told me. Traditionally, like so many similar products, ArcGIS was architected to be installed on physical boxes, virtual machines or cloud-hosted VMs. And while it doesn’t really matter to end-users where the software runs, containerizing the application means that it is far easier for businesses to scale their systems up or down as needed.

Esri ArcGIS Enterprise on Kubernetes deployment. Image Credits: Esri

“We have a lot of customers — especially some of the larger customers — that run very complex questions,” Seaton explained. “And sometimes it’s unpredictable. They might be responding to seasonal events or business events or economic events, and they need to understand not only what’s going on in the world, but also respond to their many users from outside the organization coming in and asking questions of the systems that they put in place using ArcGIS. And that unpredictable demand is one of the key benefits of Kubernetes.”

Deploying Esri ArcGIS Enterprise on Kubernetes. Image Credits: Esri

The team could have chosen to go the easy route and put a wrapper around its existing tools to containerize them and call it a day, but as Seaton noted, Esri used this opportunity to re-architect its tools and break them down into microservices.

“It’s taken us a while because we took three or four big applications that together make up [ArcGIS] Enterprise,” he said. “And we broke those apart into a much larger set of microservices. That allows us to containerize specific services and add a lot of high availability and resilience to the system without adding a lot of complexity for the administrators — in fact, we’re reducing the complexity as we do that and all of that gets installed in one single deployment script.”
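
Esri’s single deployment script isn’t public here, but the scale-up-and-down benefit Seaton describes is standard Kubernetes. A minimal sketch using the official Python client, with a hypothetical deployment name standing in for one ArcGIS microservice:

```python
from kubernetes import client, config   # pip install kubernetes

config.load_kube_config()               # reads your local kubeconfig credentials
apps = client.AppsV1Api()

# "arcgis-feature-service" is a hypothetical name, not Esri's actual chart.
apps.patch_namespaced_deployment_scale(
    name="arcgis-feature-service",
    namespace="arcgis",
    body={"spec": {"replicas": 6}},     # scale up ahead of a seasonal demand spike
)
```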

While Kubernetes simplifies much of the management experience, many companies that use ArcGIS aren’t yet familiar with it. And as Seaton and Heede noted, the company isn’t forcing anyone onto this platform. It will continue to support Windows and Linux just like before. Heede also stressed that it’s still unusual — especially in this industry — to see a complex, fully integrated system like ArcGIS being delivered in the form of microservices and multiple containers that its customers then run on their own infrastructure.

Image Credits: Esri

In addition to the Kubernetes announcement, Esri also today announced new JavaScript APIs that make it easier for developers to create applications that bring together Esri’s server-side technology and the scalability of doing much of the analysis on the client-side. Back in the day, Esri would support tools like Microsoft’s Silverlight and Adobe/Apache Flex for building rich web-based applications. “Now, we’re really focusing on a single web development technology and the toolset around that,” Esri product manager Julie Powell told me.

A bit later this month, Esri also plans to launch its new design system to make it easier and faster for developers to create clean and consistent user interfaces. This design system will launch April 22, but the company already provided a bit of a teaser today. As Powell noted, the challenge for Esri is that its design system has to help the company’s partners put their own style and branding on top of the maps and data they get from the ArcGIS ecosystem.


Apr 6, 2021

Google Cloud joins the FinOps Foundation

Google Cloud today announced that it is joining the FinOps Foundation as a Premier Member.

The FinOps Foundation is a relatively new open-source foundation, hosted by the Linux Foundation, that launched last year. It aims to bring together companies in the “cloud financial management” space to establish best practices and standards. As the term implies, “cloud financial management” is about the tools and practices that help businesses manage and budget their cloud spend. There’s a reason, after all, that there are a number of successful startups that do nothing else but help businesses optimize their cloud spend (and ideally lower it).

Maybe it’s no surprise that the FinOps Foundation was born out of Cloudability’s quarterly Customer Advisory Board meetings. Until now, CloudHealth by VMware was the Foundation’s only Premier Member among its vendor members. Other members include Cloudability, Densify, Kubecost and SoftwareOne. With Google Cloud, the Foundation has now signed up its first major cloud provider.

“FinOps best practices are essential for companies to monitor, analyze and optimize cloud spend across tens to hundreds of projects that are critical to their business success,” said Yanbing Li, vice president of Engineering and Product at Google Cloud. “More visibility, efficiency and tools will enable our customers to improve their cloud deployments and drive greater business value. We are excited to join FinOps Foundation, and together with like-minded organizations, we will shepherd behavioral change throughout the industry.”

Google Cloud has already committed to sending members to some of the Foundation’s various Special Interest Groups (SIGs) and Working Groups to “help drive open-source standards for cloud financial management.”

“The practitioners in the FinOps Foundation greatly benefit when market leaders like Google Cloud invest resources and align their product offerings to FinOps principles and standards,” said J.R. Storment, executive director of the FinOps Foundation. “We are thrilled to see Google Cloud increase its commitment to the FinOps Foundation, joining VMware as the second of three dedicated Premier Member Technical Advisory Council seats.”

Mar 29, 2021

Testing platform Tricentis acquires performance testing service Neotys

If you develop software for a large enterprise company, chances are you’ve heard of Tricentis. If you don’t develop software for a large enterprise company, chances are you haven’t. The software testing company with a focus on modern cloud and enterprise applications was founded in Austria in 2007 and grew from a small consulting firm to a major player in this field, with customers like Allianz, BMW, Starbucks, Deutsche Bank, Toyota and UBS. In 2017, the company raised a $165 million Series B round led by Insight Venture Partners.

Today, Tricentis announced that it has acquired Neotys, a popular performance testing service with a focus on modern enterprise applications and a tests-as-code philosophy. The two companies did not disclose the price of the acquisition. France-based Neotys launched in 2005 and raised about €3 million before the acquisition. Today, it has about 600 customers for its NeoLoad platform. These include BNP Paribas, Dell, Lufthansa, McKesson and TechCrunch’s own corporate parent, Verizon.

As Tricentis CEO Sandeep Johri noted, testing tools were traditionally script-based, which also meant they were very fragile whenever an application changed. Early on, Tricentis introduced a low-code tool that made the automation process both easier and more resilient. Now, as even traditional enterprises move to DevOps and release code at a faster pace than ever before, testing is becoming both more important and harder for these companies to implement.

“You have to have automation and you cannot have it be fragile, where it breaks, because then you spend as much time fixing the automation as you do testing the software,” Johri said. “Our core differentiator was the fact that we were a low-code, model-based automation engine. That’s what allowed us to go from $6 million in recurring revenue eight years ago to $200 million this year.”

Tricentis, he added, wants to be the testing platform of choice for large enterprises. “We want to make sure we do everything that a customer would need, from a testing perspective, end to end. Automation, test management, test data, test case design,” he said.

The acquisition of Neotys allows the company to expand this portfolio by adding load and performance testing as well. It’s one thing to do the standard kind of functional testing that Tricentis already did before launching an update, but once an application goes into production, load and performance testing becomes critical as well.

“Before you put it into production — or before you deploy it — you need to make sure that your application not only works as you expect it, you need to make sure that it can handle the workload and that it has acceptable performance,” Johri noted. “That’s where load and performance testing comes in and that’s why we acquired Neotys. We have some capability there, but that was primarily focused on the developers. But we needed something that would allow us to do end-to-end performance testing and load testing.”

The two companies already had an existing partnership and had integrated their tools before the acquisition — and many of their customers were already using both tools, too.

“We are looking forward to joining Tricentis, the industry leader in continuous testing,” said Thibaud Bussière, president and co-founder at Neotys. “Today’s Agile and DevOps teams are looking for ways to be more strategic and eliminate manual tasks and implement automated solutions to work more efficiently and effectively. As part of Tricentis, we’ll be able to eliminate laborious testing tasks to allow teams to focus on high-value analysis and performance engineering.”

NeoLoad will continue to exist as a stand-alone product, but users will likely see deeper integrations with Tricentis’ existing tools over time, including Tricentis Analytics, for example.

Johri tells me that he considers Tricentis one of the “best kept secrets in Silicon Valley” because the company not only started out in Europe (even though its headquarters is now in Silicon Valley) but also because it hasn’t raised a lot of venture rounds over the years. But that’s very much in line with Johri’s philosophy of building a company.

“A lot of Silicon Valley tends to pay attention only when you raise money,” he told me. “I actually think every time you raise money, you’re diluting yourself and everybody else. So if you can succeed without raising too much money, that’s the best thing. We feel pretty good that we have been very capital efficient and now we’re recognized as a leader in the category — which is a huge category with $30 billion spend in the category. So we’re feeling pretty good about it.”

Mar 18, 2021

Slapdash raises $3.7M seed to ship a workplace apps command bar

The explosion in productivity software amid a broader remote work boom has been one of the pandemic’s clearest tech impacts. But learning to use a dozen new programs while having to decipher which data is hosted where can sometimes seem to have an adverse effect on worker productivity. It’s all time that users can lose without noticing, even when carrying out common tasks like navigating to the calendar to view more info to click a link to open the browser to redirect to the native app to open a Zoom call.

Slapdash is aiming to carve out a new niche for itself among workplace software tools, pushing a desire for peak performance to the forefront with a product that shaves seconds off each instance where a user needs to find data hosted in a cloud app or carry out an action. While most of the integration-heavy software suites to emerge during the remote work boom have focused on promoting visibility or re-skinning workflows across the tangled weave of SaaS apps, Slapdash founder Ivan Kanevski hopes that the company’s efforts to engineer a quicker path to information will push tech workers to integrate another tool into their workflow.

The team tells TechCrunch that they’ve raised $3.7 million in seed funding from investors that include S28 Capital, Quiet Capital, Quarry Ventures and Twenty Two Ventures. Angels participating in the round include co-founders at companies like Patreon, Docker and Zynga.

Image Credits: Slapdash

Kanevski says the team sought to emulate the success of popular apps like Superhuman, which have pushed low-latency command line interface navigation while emulating some of the sleek internal tools used at companies like Facebook, where he spent nearly six years as a software engineer.

Slapdash’s command line widget can be pulled up anywhere, once installed, with a quick keyboard shortcut. From there, users can search through a laundry list of indexable apps including Slack, Zoom, Jira and about 20 others. Beyond command line access, users can create folders of files and actions inside the full desktop app or create their own keyboard shortcuts to quickly hammer out a task. The app is available on Mac, Windows, Linux and the web.
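
Slapdash hasn’t described its internals beyond this, but the core interaction, typing a few characters and instantly ranking every indexed command, is a classic subsequence match. A minimal sketch of that ranking idea, with a hypothetical command list:

```python
def is_subsequence(query: str, candidate: str) -> bool:
    """True if the characters of `query` appear, in order, inside `candidate`."""
    chars = iter(candidate.lower())
    return all(ch in chars for ch in query.lower())

COMMANDS = ["Join next Zoom meeting", "Create Jira ticket", "Search Slack messages"]

def palette(query: str) -> list:
    # Rank shorter matches first so the tightest hit lands under the cursor.
    return sorted((c for c in COMMANDS if is_subsequence(query, c)), key=len)

print(palette("jzm"))    # ['Join next Zoom meeting']
print(palette("jira"))   # ['Create Jira ticket']
```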

“We’re not trying to displace the applications that you connect to Slapdash,” he says. “You won’t see us, for example, building document editing, you won’t see us building project management, just because our sort of philosophy is that we’re a neutral platform.”

The company offers a free tier for users indexing up to five apps and creating 10 commands and spaces; any more than that and you level up into a $12 per month paid plan. Things look more customized for enterprise-wide pricing. As the team hopes to make the tool essential to startups, Kanevski sees the app’s hefty utility for individual users as a clear asset in scaling up.

“If you anticipate rolling this out to larger organizations, you would want the people that are using the software to have a blast with it,” he says. “We have quite a lot of confidence that even at this sort of individual atomic level, we built something pretty joyful and helpful.”

Mar 17, 2021

Amazon will expand its Amazon Care on-demand healthcare offering U.S.-wide this summer

Amazon is apparently pleased with how its Amazon Care pilot in Seattle has gone, since it announced this morning that it will be expanding the offering across the U.S. this summer, and opening it up to companies of all sizes, in addition to its own employees. The Amazon Care model combines on-demand and in-person care, and is meant as a solution from the e-commerce giant to address shortfalls in current employer-sponsored healthcare offerings.

In a blog post announcing the expansion, Amazon touted the speed of access to care made possible for its employees and their families via the remote, chat and video-based features of Amazon Care. These are facilitated via a dedicated Amazon Care app, which provides direct, live chats with a nurse or doctor. Issues that require in-person care are then handled via a house call, so a medical professional is actually sent to your home to take care of things like administering blood tests or doing a chest exam, and prescriptions are delivered to your door as well.

The expansion is being handled differently across the in-person and remote variants of care; remote services will be available starting this summer both to Amazon’s own employees and to other companies that sign on as customers. The in-person side will be rolling out more slowly, starting with availability in Washington, D.C., Baltimore, and “other cities in the coming months,” according to the company.

As of today, Amazon Care is expanding in its home state of Washington to begin serving other companies. The idea is that others will sign on to make Amazon Care part of their overall benefits packages for employees. Amazon is touting the speed advantages of testing services, including results delivery, for things like COVID-19 as a major strength of the service.

The Amazon Care model has a surprisingly Amazon twist, too – when using the in-person care option, the app will provide an updating ETA for when to expect your physician or medical technician, which is eerily similar to how its primary app treats package delivery.

While the Amazon Care pilot in Washington only launched a year-and-a-half ago, the company has had its collective mind set on upending the corporate healthcare industry for some time now. It announced a partnership with Berkshire Hathaway and JPMorgan back at the very beginning of 2018 to form a joint venture specifically to address the gaps they saw in the private corporate healthcare provider market.

That deep-pocketed all-star team ended up officially disbanding at the outset of this year, after having done a whole lot of not very much in the three years in between. One of the stated reasons that Amazon and its partners gave for unpartnering was that each had made a lot of progress on its own in addressing the problems it had faced anyway. While Berkshire Hathaway and JPMorgan’s work in that regard might be less obvious, Amazon was clearly referring to Amazon Care.

It’s not unusual for large tech companies with lots of cash on the balance sheet and a need to attract and retain top-flight talent to spin up their own healthcare benefits for their workforces. Apple and Google both have their own on-campus wellness centers staffed by medical professionals, for instance. But Amazon’s ambitions have clearly exceeded those of its peers, and it looks intent on making a business line out of the work it did to improve its own employee care services — a strategy that isn’t too dissimilar from what happened with AWS, by the way.

Mar 16, 2021

Noogata raises $12M seed round for its no-code enterprise AI platform

Noogata, a startup that offers a no-code AI solution for enterprises, today announced that it has raised a $12 million seed round led by Team8, with participation from Skylake Capital. The company, which was founded in 2019 and counts Colgate and PepsiCo among its customers, currently focuses on e-commerce, retail and financial services, but it notes that it will use the new funding to power its product development and expand into new industries.

The company’s platform offers a collection of what are essentially pre-built AI building blocks that enterprises can then connect to third-party tools like their data warehouse, Salesforce, Stripe and other data sources. An e-commerce retailer could use this to optimize its pricing, for example, thanks to recommendations from the Noogata platform, while a brick-and-mortar retailer could use it to plan which assortment to allocate to a given location.
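
Noogata’s blocks are proprietary, so the snippet below is only a generic sketch of the “pre-built building block” idea described above, with every name and value hypothetical:

```python
# Hypothetical composition API -- illustrating the pattern, not Noogata's SDK.
class PricingBlock:
    """A pre-built block wired to a data source, exposing one business question."""
    def __init__(self, source: str):
        self.source = source    # e.g. a warehouse table or Salesforce object

    def recommend(self, sku: str) -> dict:
        # A real block would run a pre-trained model; here the output is stubbed.
        return {"sku": sku, "suggested_price": 19.99, "confidence": 0.87}

block = PricingBlock(source="warehouse.sales_history")
print(block.recommend(sku="SKU-123"))
```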

Image Credits: Noogata

“We believe data teams are at the epicenter of digital transformation and that to drive impact, they need to be able to unlock the value of data. They need access to relevant, continuous and explainable insights and predictions that are reliable and up-to-date,” said Noogata co-founder and CEO Assaf Egozi. “Noogata unlocks the value of data by providing contextual, business-focused blocks that integrate seamlessly into enterprise data environments to generate actionable insights, predictions and recommendations. This empowers users to go far beyond traditional business intelligence by leveraging AI in their self-serve analytics as well as in their data solutions.”

Image Credits: Noogata

We’ve obviously seen a plethora of startups in this space lately. The proliferation of data — and the advent of data warehousing — means that most businesses now have the fuel to create machine learning-based predictions. What’s often lacking, though, is the talent. There’s still a shortage of data scientists and developers who can build these models from scratch, so it’s no surprise that we’re seeing more startups that are creating no-code/low-code services in this space. The well-funded Abacus.ai, for example, targets about the same market as Noogata.

“Noogata is perfectly positioned to address the significant market need for a best-in-class, no-code data analytics platform to drive decision-making,” writes Team8 managing partner Yuval Shachar. “The innovative platform replaces the need for internal build, which is complex and costly, or the use of out-of-the-box vendor solutions which are limited. The company’s ability to unlock the value of data through AI is a game-changer. Add to that a stellar founding team, and there is no doubt in my mind that Noogata will be enormously successful.”

