Sep 18, 2020
--

SaaS Ventures takes the investment road less traveled

Most venture capital firms are based in hubs like Silicon Valley, New York City and Boston. These firms nurture those ecosystems and they’ve done well, but SaaS Ventures decided to go a different route: it went to cities like Chicago; Green Bay, Wisconsin; and Lincoln, Nebraska.

The firm looks for enterprise-focused entrepreneurs who are trying to solve a different set of problems than you might find in these other centers of capital, issues that require digital solutions but might fall outside a typical computer science graduate’s experience.

SaaS Ventures looks at four main investment areas: trucking and logistics, manufacturing, e-commerce enablement for industries that have not typically gone online, and cybersecurity, the last being the most mainstream of the areas the firm covers.

The company’s first fund, launched in 2017, was worth $20 million, and SaaS Ventures closed a second fund of the same size earlier this month. It tends to stick to small-dollar investments, partnering with larger firms when it contributes funds to a deal.

We talked to Collin Gutman, founder and managing partner at SaaS Ventures, to learn about his investment philosophy, and why he decided to take the road less traveled for his investment thesis.

A different investment approach

Gutman’s journey to find enterprise startups in out-of-the-way places began in 2012, when he worked at an early enterprise startup accelerator called Acceleprise. “We were really the first ones who said enterprise tech companies are wired differently, and need a different set of early-stage resources,” Gutman told TechCrunch.

Through that experience, he decided to launch SaaS Ventures in 2017, with several key ideas underpinning the firm’s investment thesis: after his experience at Acceleprise, he decided to concentrate on the enterprise from a slightly different angle than most early-stage VC establishments.

Collin Gutman, founder and managing partner at SaaS Ventures (Image Credits: SaaS Ventures)

The second part of his thesis was to concentrate on secondary markets, which meant looking beyond the popular startup ecosystem centers and investing in areas that didn’t typically get much attention. To date, SaaS Ventures has made investments in 23 states and Toronto, seeking startups that others might have overlooked.

“We have really phenomenal coverage in terms of not just geography, but in terms of what’s happening with the underlying businesses, as well as their customers,” Gutman said. He believes that broad second-tier market data gives his firm an upper hand when selecting startups to invest in. More on that later.

Sep 16, 2020
--

Narrator raises $6.2M for a new approach to data modelling that replaces star schema

Snowflake went public this week, and in a mark of the wider ecosystem that is evolving around data warehousing, a startup that has built a completely new concept for modelling warehoused data is announcing funding. Narrator — which uses an 11-column ordering model rather than standard star schema to organise data for modelling and analysis — has picked up a Series A round of $6.2 million, money that it plans to use to help it launch and build up users for a self-serve version of its product.

The funding is being led by Initialized Capital along with continued investment from Flybridge Capital Partners and Y Combinator — where the startup was in a 2019 cohort — as well as new investors, including Paul Buchheit.

Narrator has been around for three years, but its first phase was based around providing modelling and analytics directly to companies as a consultancy, helping companies bring together disparate, structured data sources from marketing, CRM, support desks and internal databases to work as a unified whole. As consultants, using an earlier build of the tool that it’s now launching, the company’s CEO Ahmed Elsamadisi said he and others each juggled queries “for eight big companies single-handedly,” while deep-dive analyses were done by another single person.

Having validated the approach as consultants, the company’s new self-serve version aims to give data scientists and analysts a simplified way of ordering data so that queries, described as actionable analyses in a story-like format (“Narratives,” as the company calls them), can be run across that data quickly (in hours rather than weeks) and consistently.

The new data-as-a-service is also priced in SaaS tiers, with a free tier for the first 5 million rows of data, and a sliding scale of pricing after that based on data rows, user numbers and Narratives in use.

Image Credits: Narrator

Elsamadisi, who co-founded the startup with Matt Star, Cedric Dussud and Michael Nason, said that data analysts have long lived with the problems of star schema modelling (and, by extension, the related snowflake schema format), which he summed up as “layers of dependencies, lack of a source of truth, numbers not matching and endless maintenance.”

“At its core, when you have lots of tables built from lots of complex SQL, you end up with a growing house of cards requiring the need to constantly hire more people to help make sure it doesn’t collapse.”

(We)Work Experience

It was while he was working as lead data scientist at WeWork — yes, he told me, maybe it wasn’t actually a tech company, but it had “tech at its core” — that he had a breakthrough moment of realising how to restructure data to get around these issues.

Before that, things were tough on the data front. WeWork had 700 tables that his team was managing using a star schema approach, covering 85 systems and 13,000 objects. The data covered everything from acquiring buildings to the flow of customers through those buildings, how things would change as customers churned, along with marketing and social media activity, all of it growing in line with the company’s own rapidly scaling empire. All of that meant a mess at the data end.

“Data analysts wouldn’t be able to do their jobs,” he said. “It turns out we could barely even answer basic questions about sales numbers. Nothing matched up, and everything took too long.”

The team had 45 people on it, but even so it ended up having to implement a hierarchy for answering questions, as there were so many and not enough time to dig through and answer them all. “And we had every data tool there was,” he added. “My team hated everything they did.”

The single-table column model that Narrator uses, he said, “had been theorised” in the past but hadn’t been figured out.

The spark, he said, was to think of data structured in the same way that we ask questions, where — as he described it — each piece of data can be bridged together and then also used to answer multiple questions.

“The main difference is we’re using a time-series table to replace all your data modelling,” Elsamadisi explained. “This is not a new idea, but it was always considered impossible. In short, we tackle the same problem as most data companies to make it easier to get the data you want but we are the only company that solves it by innovating on the lowest-level data modelling approach. Honestly, that is why our solution works so well. We rebuilt the foundation of data instead of trying to make a faulty foundation better.”

Narrator calls the composite table, which includes all of your data reformatted to fit in its 11-column structure, the Activity Stream.
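To make the contrast concrete, here is a minimal, hypothetical sketch of the single-table idea in Python with SQLite. The column names below are illustrative assumptions (the article doesn’t enumerate Narrator’s actual 11 columns), and this is a cut-down table rather than the real Activity Stream schema:

```python
import sqlite3

# One time-series "activity stream" table standing in for many star-schema
# fact/dimension tables. Columns are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE activity_stream (
        activity_id    TEXT,  -- unique row id
        ts             TEXT,  -- when the activity happened (ISO timestamp)
        customer       TEXT,  -- who did it
        activity       TEXT,  -- what happened ('signed_up', 'placed_order', ...)
        feature_1      TEXT,  -- small generic metadata slots
        revenue_impact REAL
    )
""")
rows = [
    ("a1", "2020-09-01T09:00", "acme",   "signed_up",     "organic",  0.0),
    ("a2", "2020-09-02T10:00", "acme",   "placed_order",  "web",     99.0),
    ("a3", "2020-09-03T11:00", "acme",   "opened_ticket", "billing",  0.0),
    ("a4", "2020-09-02T12:00", "globex", "signed_up",     "ads",      0.0),
]
conn.executemany("INSERT INTO activity_stream VALUES (?,?,?,?,?,?)", rows)

# Different questions become filters over one table rather than joins
# across many: total order revenue...
revenue = conn.execute(
    "SELECT SUM(revenue_impact) FROM activity_stream WHERE activity = 'placed_order'"
).fetchone()[0]

# ...and how many signups were later followed by an order, answered from
# the same table by relating activities per customer over time.
converted = conn.execute("""
    SELECT COUNT(*) FROM activity_stream s
    WHERE s.activity = 'signed_up'
      AND EXISTS (SELECT 1 FROM activity_stream o
                  WHERE o.customer = s.customer
                    AND o.activity = 'placed_order'
                    AND o.ts > s.ts)
""").fetchone()[0]
print(revenue, converted)  # 99.0 1
```

The point of the sketch: different business questions become filters and self-joins over one consistently shaped table, rather than bespoke SQL over a growing web of fact and dimension tables.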

Elsamadisi said it takes about 30 minutes to start using Narrator for the first time, and about a month to learn to use it thoroughly. “But you’re not going back to SQL after that, it’s so much faster,” he added.

Narrator’s initial market has been providing services to other tech companies, and specifically startups, but the plan is to open it up to a much wider set of verticals. And in a move that might help with that, longer term, it also plans to open source some of its core components so that third parties can build data products on top of the framework more quickly.

As for competitors, he says that it’s essentially the tools that he and other data scientists have always used, although “we’re going against a ‘best practice’ approach (star schema), not a company.” Airflow, DBT, Looker’s LookML, Chartio’s Visual SQL, Tableau Prep are all ways to create and enable the use of a traditional star schema, he added. “We’re similar to these companies — trying to make it as easy and efficient as possible to generate the tables you need for BI, reporting and analysis — but those companies are limited by the traditional star schema approach.”

So far the proof has been in the data. Narrator says that companies average around 20 transformations (the unit used to answer questions) compared to hundreds in a star schema, and that those transformations average 22 lines compared to 1,000+ lines in traditional modelling. For those that learn how to use it, the average time for generating a report or running some analysis is four minutes, compared to weeks in traditional data modelling. 

“Narrator has the potential to set a new standard in data,” said Jen Wolf, Initialized Capital COO and partner and new Narrator board member, in a statement. “We were amazed to see the quality and speed with which Narrator delivered analyses using their product. We’re confident once the world experiences Narrator this will be how data analysis is taught moving forward.”

Sep 16, 2020
--

Luther.AI is a new AI tool that acts like Google for personal conversations

When it comes to pop culture, company executives or history questions, most of us use Google as a memory crutch to recall information we can’t always keep in our heads. But Google can’t help you remember the name of your client’s spouse or the great idea you came up with at a meeting the other day.

Enter Luther.AI, which purports to be Google for your memory by capturing and transcribing audio recordings, while using AI to deliver the right information from your virtual memory bank in the moment of another online conversation or via search.

The company is releasing an initial browser-based version of its product this week at TechCrunch Disrupt, where it’s competing for the $100,000 Battlefield prize.

Luther.AI’s founders say the company is built on the premise that human memory is fallible, and that weakness limits our individual intelligence. The idea behind Luther.AI is to provide a tool to retain, recall and even augment our own brains.

It’s a tall order, but the company’s founders believe it’s possible through the growing power of artificial intelligence and other technologies.

“It’s made possible through a convergence of neuroscience, NLP and blockchain to deliver seamless in-the-moment recall. GPT-3 is built on the memories of the public internet, while Luther is built on the memories of your private self,” company founder and CEO Suman Kanuganti told TechCrunch.

It starts by recording your interactions throughout the day. For starters, that means online meetings in a browser, since that is how we interact most often right now. Over time, though, the founders envision a high-quality 5G recording device you wear throughout your workday to capture your interactions.

If that is worrisome to you from a privacy perspective, Luther is building in a few safeguards starting with high-end encryption. Further, you can only save other parties’ parts of a conversation with their explicit permission. “Technologically, we make users the owner of what they are speaking. So for example, if you and I are having a conversation in the physical world unless you provide explicit permission, your memories are not shared from this particular conversation with me,” Kanuganti explained.

Finally, each person owns their own data in Luther and nobody else can access or use these conversations either from Luther or any other individual. They will eventually enforce this ownership using blockchain technology, although Kanuganti says that will be added in a future version of the product.

Luther.ai search results recalling what a person said at a meeting the other day about customer feedback.

Image Credits: Luther.ai

Kanuganti says the product will have utility even for an individual looking to improve memory recall, but its true power won’t be realized with just a few people using it inside a company; it comes from the network effect of dozens or hundreds of users.

While they are releasing the browser-based product this week, they will eventually have a stand-alone app, and can also envision other applications taking advantage of the technology in the future via an API where developers can build Luther functionality into other apps.

The company was founded at the beginning of this year by Kanuganti and three co-founders: CTO Sharon Zhang, design director Kristie Kaiser and scientist Marc Ettlinger. It has raised $500,000 and currently has 14 employees, including the founders.

Sep 16, 2020
--

Pure Storage acquires data service platform Portworx for $370M

Pure Storage, the public enterprise data storage company, today announced that it has acquired Portworx, a well-funded startup that provides a cloud-native storage and data-management platform based on Kubernetes, for $370 million in cash. This marks Pure Storage’s largest acquisition to date and shows how important this market for multicloud data services has become.

Current Portworx enterprise customers include the likes of Carrefour, Comcast, GE Digital, Kroger, Lufthansa, and T-Mobile. At the core of the service is its ability to help users migrate their data and create backups. It creates a storage layer that allows developers to then access that data, no matter where it resides.

Pure Storage will use Portworx’s technology to expand its hybrid and multicloud services and provide Kubernetes-based data services across clouds.

Image Credits: Portworx

“I’m tremendously proud of what we’ve built at Portworx: An unparalleled data services platform for customers running mission-critical applications in hybrid and multicloud environments,” said Portworx CEO Murli Thirumale. “The traction and growth we see in our business daily shows that containers and Kubernetes are fundamental to the next-generation application architecture and thus competitiveness. We are excited for the accelerated growth and customer impact we will be able to achieve as a part of Pure.”

When the company raised its Series C round last year, Thirumale told me that Portworx had expanded its customer base by over 100% and that its bookings increased by 376% from 2018 to 2019.

“As forward-thinking enterprises adopt cloud-native strategies to advance their business, we are thrilled to have the Portworx team and their groundbreaking technology joining us at Pure to expand our success in delivering multicloud data services for Kubernetes,” said Charles Giancarlo, chairman and CEO of Pure Storage. “This acquisition marks a significant milestone in expanding our Modern Data Experience to cover traditional and cloud native applications alike.”

Sep 15, 2020
--

In 2020, Warsaw’s startup ecosystem is ‘a place to observe carefully’

If you listed the trends that have captured the attention of 20 Warsaw-focused investors who replied to our recent surveys, automation/AI, enterprise SaaS, cleantech, health, remote work and the sharing economy would top the list. These VCs said they are seeking opportunities in the “digital twin” space, proptech and expanded blockchain tokenization inside industries.

Investors in Central and Eastern Europe are generally looking for the same things as VCs based elsewhere: startups that have a unique value proposition, capital efficiency, motivated teams, post-revenue status and a well-defined market niche.

Out of the cohort we interviewed, several told us that COVID-19 had not yet substantially transformed how they do business. As Michał Papuga, a partner at Flashpoint VC, put it, “the situation since March hasn’t changed a lot, but we went from extreme panic to extreme bullishness. Neither of these is good and I would recommend to stick to the long-term goals and not to be pressured.”

Said Pawel Lipkowski of RBL_VC, “Warsaw is at its pivotal point — think Berlin in the ‘90s. It’s a place to observe carefully.”

For the conclusion, we spoke to the following investors:

Karol Szubstarski, partner, OTB Ventures

What trends are you most excited about investing in, generally?
A gradual shift of enterprises toward increased use of automation and AI, which enables dramatic improvements in efficiency, cost reduction and the transfer of enterprise resources from tedious, repeatable and mundane tasks to more exciting, value-added opportunities.

What’s your latest, most exciting investment?
One of the most exciting opportunities is ICEYE. The company is a leader and first mover in synthetic-aperture radar (SAR) technology for microsatellites. It is building and operating its own commercial constellation of SAR microsatellites capable of providing satellite imagery regardless of cloud cover, weather conditions or time of day (at resolution comparable to traditional SAR satellites but with a 100x lower cost factor), which is disrupting the multibillion-dollar satellite imagery market.

Are there startups that you wish you would see in the industry but don’t? What are some overlooked opportunities right now?
I would love to see more startups in the digital twin space: technology that enables the creation of an exact digital replica of something in physical space — a product, a process or even a whole ecosystem. This kind of solution enables experiments and [the implementation of] changes that otherwise could be extremely costly or risky; it can provide immense added value for customers.

What are you looking for in your next investment, in general?
A company with unique value proposition to its customers, deep tech component that provides competitive edge over other players in the market and a founder with global vision and focus on execution of that vision.

Which areas are either oversaturated or would be too hard to compete in at this point for a new startup? What other types of products/services are you wary or concerned about?
No market or sector is so saturated that it has no room for innovation. Some markets seem more challenging than others due to an immense competitive landscape (e.g., food delivery, language-learning apps) but can still be disrupted by a new entrant with a unique value proposition.

How much are you focused on investing in your local ecosystem versus other startup hubs (or everywhere) in general? More than 50%? Less?
OTB is focused on opportunities with links to Central Eastern European talent (with no bias toward any hub in the region), meaning companies that leverage local engineering/entrepreneurial talent in order to build world-class products to compete globally (usually HQ outside CEE).

Which industries in your city and region seem well-positioned to thrive, or not, long term? What are companies you are excited about (your portfolio or not), which founders?
The CEE region is recognized for its sizable and highly skilled talent pool in engineering and software development. The region is well-positioned to build solutions that leverage deep, unique tech regardless of vertical (especially in B2B). Historically, the region has been especially strong in AI/ML, voice/speech/NLP technologies, cybersecurity and data analytics.

How should investors in other cities think about the overall investment climate and opportunities in your city?
CEE (including Poland and Warsaw) has always been recognized as an exceptionally strong region in terms of engineering/IT talent. For a number of years, the inherent risk aversion of entrepreneurs drove a more “copycat”/local-market approach, while holding back more ambitious deep tech opportunities. In recent years we have witnessed a paradigm shift, with a new generation of entrepreneurs tackling problems with unique, deep tech solutions and putting emphasis on global expansion rather than shallow local markets. As such, the quality of deals has been growing steadily and currently reflects top quality on a global scale, especially at the tech level. The CEE market is also producing a growing number of startups overall, driven mostly by an abundance of early-stage capital and by success stories in the region (e.g., DataRobot, Bolt, UiPath) that are successfully evangelizing entrepreneurship among corporates and engineers.

Do you expect to see a surge in more founders coming from geographies outside major cities in the years to come, with startup hubs losing people due to the pandemic and lingering concerns, plus the attraction of remote work?
I believe that local hubs will hold their dominant position in the ecosystem. The remote/digital workforce will grow in numbers but proximity to capital, human resources and markets still will remain the prevalent force in shaping local startup communities.

Which industry segments that you invest in look weaker or more exposed to potential shifts in consumer and business behavior because of COVID-19? What are the opportunities startups may be able to tap into during these unprecedented times?
OTB invests in general in companies with clearly defined technological advantage, making quantifiable and near-term difference to their customers (usually in the B2B sector), which is a value-add regardless of the market cycle. The economic downturn works generally in favor of technological solutions enabling enterprise clients to increase efficiency, cut costs, bring optimization and replace manual labour with automation — and the vast majority of OTB portfolio fits that description. As such, the majority of the OTB portfolio has not been heavily impacted by the COVID pandemic.

How has COVID-19 impacted your investment strategy? What are the biggest worries of the founders in your portfolio? What is your advice to startups in your portfolio right now?
The COVID pandemic has not impacted our investment strategy in any way. OTB still pursues unique tech opportunities that can provide its customers with immediate value added. This kind of approach provides a relatively high level of resilience against economic downturns (obviously, sales cycles are extending but in general sales pipeline/prospects/retention remains intact). Liquidity in portfolio is always the number one concern in uncertain, challenging times. Lean approach needs to be reintroduced, companies need to preserve cash and keep optimizing — that’s the only way to get through the crisis.

Are you seeing “green shoots” regarding revenue growth, retention or other momentum in your portfolio as they adapt to the pandemic?
A good example in our portfolio is Segron, a provider of an automated testing platform for applications, databases and enterprise network infrastructure. Software development, deployment and maintenance in enterprise IT ecosystem requires continuous and rigorous testing protocols and as such a lot of manual heavy lifting with highly skilled engineering talent being involved (which can be used in a more productive way elsewhere). The COVID pandemic has kept engineers home (with no ability for remote testing) while driving demand for digital services (and as such demand for a reliable IT ecosystem). The Segron automated framework enables full automation of enterprise testing leading to increased efficiency, cutting operating costs and giving enterprise customers peace of mind and a good night’s sleep regarding their IT infrastructure in the challenging economic environment.

What is a moment that has given you hope in the last month or so? This can be professional, personal or a mix of the two.
I remain impressed by the unshakeable determination of multiple founders and their teams to overcome all the challenges of the unfavorable economic ecosystem.

Sep 15, 2020
--

Latent AI makes edge AI workloads more efficient

Latent AI, a startup that was spun out of SRI International, makes it easier to run AI workloads at the edge by dynamically managing workloads as necessary.

Using its proprietary compression and compilation process, Latent AI promises to compress library files by 10x and run them with 5x lower latency than other systems, all while using less power thanks to its new adaptive AI technology, which the company is launching as part of its appearance in the TechCrunch Disrupt Battlefield competition today.

Founded by CEO Jags Kandasamy and CTO Sek Chai, the company has already raised a $6.5 million seed round led by Steve Jurvetson of Future Ventures and followed by Autotech Ventures.

Before starting Latent AI, Kandasamy sold his previous startup, OtoSense, to Analog Devices (in addition to managing HPE’s Mid-Market Security business before that). OtoSense used data from sound and vibration sensors for predictive maintenance use cases. Before its sale, the company worked with the likes of Delta Airlines and Airbus.

Image Credits: Latent AI

In some ways, Latent AI picks up some of this work and marries it with IP from SRI International.

“With OtoSense, I had already done some edge work,” Kandasamy said. “We had moved the audio recognition part out of the cloud. We did the learning in the cloud, but the recognition was done in the edge device and we had to convert quickly and get it down. Our bill in the first few months made us move that way. You couldn’t be streaming data over LTE or 3G for too long.”

At SRI, Chai worked on a project that looked at how to best manage power for flying objects where, if you have a single source of power, the system could intelligently allocate resources for either powering the flight or running the onboard compute workloads, mostly for surveillance, and then switch between them as needed. Most of the time, in a surveillance use case, nothing happens. And while that’s the case, you don’t need to compute every frame you see.

“We took that and we made it into a tool and a platform so that you can apply it to all sorts of use cases, from voice to vision to segmentation to time series stuff,” Kandasamy explained.

What’s important to note here is that the company offers the various components of what it calls the Latent AI Efficient Inference Platform (LEIP) as standalone modules or as a fully integrated system. The compressor and compiler are the first two of these and what the company is launching today is LEIP Adapt, the part of the system that manages the dynamic AI workloads Kandasamy described above.

Image Credits: Latent AI

In practical terms, the use case for LEIP Adapt is that your battery-powered smart doorbell, for example, can run in a low-powered mode for a long time, waiting for something to happen. Then, when somebody arrives at your door, the camera wakes up to run a larger model — maybe even on the doorbell’s base station that is plugged into power — to do image recognition. And if a whole group of people arrives at once (which isn’t likely right now, but maybe next year, after the pandemic is under control), the system can offload the workload to the cloud as needed.
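The tiered behavior described above can be sketched in a few lines of Python. This is purely illustrative dispatch logic, not Latent AI’s LEIP API; the `Frame` type, thresholds and model functions are all invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    motion_score: float  # cheap motion estimate from the always-on path
    num_people: int      # detections once the heavier model runs

def tiny_model(frame: Frame) -> bool:
    """Always-on, low-power check: is anything happening at all?"""
    return frame.motion_score > 0.2

def local_model(frame: Frame) -> str:
    """Larger on-device (or base-station) model, woken only on motion."""
    return "person" if frame.num_people >= 1 else "nothing"

def dispatch(frame: Frame, cloud_threshold: int = 3) -> str:
    """Route each frame to the cheapest tier that can handle it."""
    if not tiny_model(frame):
        return "sleep"              # stay in the low-power mode
    if frame.num_people >= cloud_threshold:
        return "offload_to_cloud"   # crowd scene: too heavy for the edge
    return local_model(frame)

print(dispatch(Frame(0.05, 0)))  # sleep
print(dispatch(Frame(0.9, 1)))   # person
print(dispatch(Frame(0.9, 4)))   # offload_to_cloud
```

The design point is that the expensive model only ever runs when the cheap check says it is worth waking up, which is what keeps the battery-powered device alive for so long.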

Kandasamy tells me that the interest in the technology has been “tremendous.” Given his previous experience and the network of SRI International, it’s maybe no surprise that Latent AI is getting a lot of interest from the automotive industry, but Kandasamy also noted that the company is working with consumer companies, including a camera and a hearing aid maker.

The company is also working with a major telco that is looking at Latent AI as part of its AI orchestration platform, and with a large CDN provider that wants help running AI workloads on a JavaScript backend.

Sep 15, 2020
--

Data virtualization service Varada raises $12M

Varada, a Tel Aviv-based startup that focuses on making it easier for businesses to query data across services, today announced that it has raised a $12 million Series A round led by Israeli early-stage fund MizMaa Ventures, with participation by Gefen Capital.

“If you look at the storage aspect for big data, there’s always innovation, but we can put a lot of data in one place,” Varada CEO and co-founder Eran Vanounou told me. “But translating data into insight? It’s so hard. It’s costly. It’s slow. It’s complicated.”

That’s a lesson he learned during his time as CTO of LivePerson, which he described as a classic big data company. And just like at LivePerson, where the team had to reinvent the wheel to solve its data problems, again and again, every company — and not just the large enterprises — now struggles with managing their data and getting insights out of it, Vanounou argued.

Image Credits: Varada

The rest of the founding team, David Krakov, Roman Vainbrand and Tal Ben-Moshe, already had a lot of experience in dealing with these problems, too, with Ben-Moshe having served as the chief software architect of Dell EMC’s XtremIO flash array unit, for example. They built the system for indexing big data that’s at the core of Varada’s platform (with the open-source Presto SQL query engine being one of the other cornerstones).

Image Credits: Varada

Essentially, Varada embraces the idea of data lakes and enriches it with its indexing capabilities, and those indexing capabilities are where Varada’s smarts can be found. As Vanounou explained, the company uses a machine learning system to understand when users tend to run certain workloads, and then caches the data ahead of time, making the system far faster than its competitors.

“If you think about big organizations and think about the workloads and the queries, what happens during the morning time is different from evening time. What happened yesterday is not what happened today. What happened on a rainy day is not what happened on a shiny day. […] We listen to what’s going on and we optimize. We leverage the indexing technology. We index what is needed when it is needed.”
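As a rough illustration of that idea (not Varada’s actual implementation), a workload-aware system can tally which datasets tend to be queried at which times and warm the indexes for the likely ones in advance. Everything here, from the `history` log to the `datasets_to_warm` helper, is a hypothetical sketch:

```python
from collections import Counter, defaultdict

# Observed (hour_of_day, dataset) pairs from past queries: morning analysts
# hit sales data, the evening shift hits support data, and so on.
history = [
    (9, "sales"), (9, "sales"), (9, "marketing"),
    (18, "support"), (18, "support"), (18, "sales"),
]

# Build a per-hour frequency table of which datasets get queried.
by_hour = defaultdict(Counter)
for hour, dataset in history:
    by_hour[hour][dataset] += 1

def datasets_to_warm(hour: int, top_n: int = 1) -> list:
    """Return the datasets most often queried at this hour in the past,
    i.e. the ones worth indexing/caching before the hour arrives."""
    return [name for name, _ in by_hour[hour].most_common(top_n)]

print(datasets_to_warm(9))   # ['sales']
print(datasets_to_warm(18))  # ['support']
```

The payoff of predicting the workload is the one Vanounou describes: by the time the morning queries arrive, the data they need is already indexed, so the query itself is fast and less raw data has to be replicated.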

That helps speed up queries, but it also means less data has to be replicated, which also brings down the cost. As MizMaa’s Aaron Applbaum noted, since Varada is not a SaaS solution, the buyers still get all of the discounts from their cloud providers, too.

In addition, the system can allocate resources intelligently so that different users can tap into different amounts of bandwidth. You can tell it to give customers more bandwidth than your financial analysts, for example.

“Data is growing like crazy: in volume, in scale, in complexity, in who requires it and what the business intelligence uses are, what the API uses are,” Applbaum said when I asked him why he decided to invest. “And compute is getting slightly cheaper, but not really, and storage is getting cheaper. So if you can make the trade-off to store more stuff, and access things more intelligently, more quickly, more agile — that was the basis of our thesis, as long as you can do it without compromising performance.”

Varada, with its team of experienced executives, architects and engineers, ticked a lot of those boxes, but Applbaum also noted that unlike some other Israeli startups, the team understood that it had to listen to customers and understand their needs, too.

“In Israel, you have a history — and it’s become less and less the case — but historically, there’s a joke that it’s ‘ready, fire, aim.’ You build a technology, you’ve got this beautiful thing and you’re like, ‘alright, we did it,’ but without listening to the needs of the customer,” he explained.

The Varada team is not afraid to compare itself to Snowflake, which at least at first glance seems to make similar promises. Vanounou praised the company for opening up the data warehousing market and proving that people are willing to pay for good analytics. But he argues that Varada’s approach is fundamentally different.

“We embrace the data lake. So if you are Mr. Customer, your data is your data. We’re not going to take it, move it, copy it. This is your single source of truth,” he said. And in addition, the data can stay in the company’s virtual private cloud. He also argues that Varada isn’t so much focused on the business users but the technologists inside a company.


Sep
15
2020
--

Verkada adds environmental sensors to cloud-based building operations toolkit

As we go deeper into the pandemic, many buildings sit empty or have limited capacity. During times like these, having visibility into the state of the building can give building operations peace of mind. Today, Verkada, a startup that helps operations manage buildings via the cloud, announced a new set of environmental sensors to give customers even greater insight into building conditions.

The company had previously developed cloud-based video cameras and access control systems. Verkada CEO and co-founder Filip Kaliszan says today’s announcement is about building on these two earlier products.

“What we do today is cameras and access control — cameras, of course, provide the eyes and the view into buildings and spaces, while access control controls how you get in and out of these spaces,” Kaliszan told TechCrunch. Operations teams can manage these devices from the cloud on any device.

The sensor pack that the company is announcing today layers on a multi-function view into the state of the environment inside a building. “The first product that we’re launching along this environmental sensor line is the SV11, which is a very powerful unit with multiple sensors on board, all of which can be managed in the cloud through our Verkada command platform. The sensors will give customers insight into things like air quality, temperature, humidity, motion and occupancy of the space, as well as the noise level,” he said.
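A unit like the SV11 feeds an operations team readings it can act on, such as catching an overheating server room, a use case Kaliszan mentions below. The snippet sketches that pattern; the field names and thresholds are invented for illustration and are not Verkada's API.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    """Hypothetical snapshot from a multi-sensor unit like the SV11;
    fields mirror the sensor types mentioned above, not a real schema."""
    temperature_c: float
    humidity_pct: float
    noise_db: float
    motion_detected: bool

def check_alerts(reading, max_temp_c=27.0, max_humidity_pct=60.0):
    """Flag conditions an operations team might act on, e.g. an
    overheating server room. Thresholds are illustrative defaults."""
    alerts = []
    if reading.temperature_c > max_temp_c:
        alerts.append("temperature above threshold")
    if reading.humidity_pct > max_humidity_pct:
        alerts.append("humidity above threshold")
    return alerts

alerts = check_alerts(SensorReading(31.5, 45.0, 38.0, False))
# one alert: temperature above threshold
```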

There is a clear strategy behind the company’s product road map. The idea is to give building operations staff a growing picture of what’s going on inside the space. “You can think of all the data being combined with the other aspects of our platform, and then begin delivering a truly integrated building and setting the standard for enterprise building security,” Kaliszan said.

These tools, and the ability to access all the data about a building remotely in the cloud, obviously have even more utility during the pandemic. “I think we’re fortunate that our products can help customers mitigate some of the effects of the pandemic. So we’ve seen a lot of customers use our tools to help them manage through the pandemic, which is great. But when we were originally designing this environmental sensor, the rationale behind it were these core use cases like monitoring server rooms for environmental changes.”

The company, which was founded in 2016, has been doing well. It has 4,200 customers and roughly 400 employees. It is still growing and actively hiring, and expects to reach 500 employees by the end of the year. It has raised $138.9 million, with the most recent round coming in January of this year, when it raised an $80 million Series C led by Felicis Ventures at a $1.6 billion valuation.

Sep
14
2020
--

Airtable’s Howie Liu has no interest in exiting, even as the company’s valuation soars

In the middle of a pandemic, Airtable, the low-code startup, has actually had an excellent year. Just the other day, the company announced it had raised $185 million on a whopping $2.585 billion valuation. It also announced some new features that take it from the realm of pure no-code and deeper into low-code territory, which allows users to extend the product in new ways.

Airtable CEO and co-founder Howie Liu was a guest today at TechCrunch Disrupt, where he was interviewed by TechCrunch News Editor Frederic Lardinois.

Liu said that the original vision that has stayed pretty steady since the company launched in 2013 was to democratize software creation. “We believe that more people in the world should become software builders, not just software users, and pretty much the whole time that we’ve been working on this company we’ve been charting our course towards that end goal,” he said.

But something changed recently, where Liu saw people who needed to do a bit more with the tool than that original vision allowed.

“So, the biggest shift that’s happening today with our fundraise and our launch announcement is that we’re going from being a no-code product, a purely no-code solution where you don’t have to use code, but neither can you use code to extend the product to now being a low-code solution, and one that also has a lot more extensibility with other features like automation, allowing people to build logic into Airtable without any technical knowledge,” he said.

In addition, the company, with 200,000 customers, has created a marketplace where users can share applications they’ve built. As the pandemic has taken hold, Liu says that he’s seen a shift in the types of deals he’s been seeing. That’s partly due to small businesses, which were once his company’s bread and butter, suffering more economic pain as a result of COVID.

But he has seen larger enterprise customers fill the void, and it’s not too big a stretch to think that the new extensibility features could be a nod to these more lucrative customers, who may require a bit more power than a pure no-code solution would provide.

“On the enterprise side of our business we’ve seen, for instance this summer, a 5x increase in enterprise deal closing velocity from the prior summer period, and this incredible appetite from enterprise signings with dozens of six-figure deals, some seven-figure deals and thousands of new paid customers overall,” he said.

In spite of this great success, the upward trend of the business and the fat valuation, Liu was in no mood to talk about an IPO. In his view, there is plenty of time for that, and in spite of being a seven-year-old company with great momentum, he says he’s simply not thinking about it.

Nor did he express any interest in being acquired, and he says that his investors weren’t putting any pressure on him to exit.

“It’s always been about finding investors who are really committed and aligned to the long-term goals and approach that we have to this business that matters more to us than the actual valuation numbers or any other kind of technical aspects of the round,” he said.

Sep
14
2020
--

Quantum startup CEO suggests we are only five years away from a quantum desktop computer

Today at TechCrunch Disrupt 2020, leaders from three quantum computing startups joined TechCrunch editor Frederic Lardinois to discuss the future of the technology. IonQ CEO and president Peter Chapman suggested we could be as little as five years away from a desktop quantum computer, but not everyone agreed on that optimistic timeline.

“I think within the next several years, five years or so, you’ll start to see [desktop quantum machines]. Our goal is to get to a rack-mounted quantum computer,” Chapman said.

But that seemed a tad optimistic to Alan Baratz, CEO at D-Wave Systems. He says the superconducting technology his company is building requires a special kind of rather large quantum refrigeration unit called a dilution fridge, and that unit makes a five-year goal of a desktop quantum PC highly unlikely.

Itamar Sivan, CEO at Quantum Machines, too, believes we have a lot of steps to go before we see that kind of technology, and a lot of hurdles to overcome to make that happen.

“This challenge is not within a specific, singular problem about finding the right material or solving some very specific equation, or anything. It’s really a challenge, which is multidisciplinary to be solved here,” Sivan said.

Chapman also sees a day when we could have edge quantum machines, for instance on a military plane that couldn’t efficiently access quantum computers in the cloud.

“You know, you can’t rely on a system which is sitting in a cloud. So it needs to be on the plane itself. If you’re going to apply quantum to military applications, then you’re going to need edge-deployed quantum computers,” he said.

One thing worth mentioning is that IonQ’s approach to quantum is very different from D-Wave’s and Quantum Machines’.

IonQ relies on technology pioneered in atomic clocks for its form of quantum computing. Quantum Machines doesn’t build quantum processors. Instead, it builds the hardware and software layer to control these machines, which are reaching a point where that can’t be done with classical computers anymore.

D-Wave, on the other hand, uses a concept called quantum annealing, which allows it to create thousands of qubits, but at the cost of higher error rates.

As the technology develops further in the coming decades, these companies believe they are offering value by giving customers a starting point into this powerful form of computing, which when harnessed will change the way we think of computing in a classical sense. But Sivan says there are many steps to get there.

“This is a huge challenge that would also require focused and highly specialized teams that specialize in each layer of the quantum computing stack,” he said. One way to help solve that is by partnering broadly to help solve some of these fundamental problems, and working with the cloud companies to bring quantum computing, however they choose to build it today, to a wider audience.

“In this regard, I think that this year we’ve seen some very interesting partnerships form which are essential for this to happen. We’ve seen companies like IonQ and D-Wave, and others partnering with cloud providers who deliver their own quantum computers through other companies’ cloud service,” Sivan said. And he said his company would be announcing some partnerships of its own in the coming weeks.

The ultimate goal of all three companies is to eventually build a universal quantum computer, one that can achieve the goal of providing true quantum power. “We can and should continue marching toward universal quantum to get to the point where we can do things that just can’t be done classically,” Baratz said. But he and the others recognize we are still in the very early stages of reaching that end game.
