Dropbox boosts its price range for its IPO as it nears an $8B valuation

Dropbox said it would be increasing its IPO price range — the range at which it will sell its shares in its initial public offering — from $16-$18 per share to $18-$20 per share, giving the company a valuation that could reach close to $8 billion, according to an updated filing with the Securities and Exchange Commission.

Including all shares offered by stockholders selling in this offering, the “greenshoe” over-allotment and the IPO itself, Dropbox will have a valuation between $7.2 billion and $7.96 billion. Based on a fully diluted share count, Dropbox’s valuation should land between $7.8 billion and $8.75 billion. That’s below Dropbox’s previous $10 billion private valuation, but it’s still a signal that investors are interested in buying into Dropbox’s IPO, which will be the best-known enterprise name to go public this year. Cloud security company Zscaler went public earlier this month and immediately saw a massive pop, but Dropbox will probably be lumped into the same boat as Snap: a signal of whether investors are going to be interested in hyped startups.
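Those valuation bands imply a rough share count, which can be back-calculated from the figures above. A quick sketch (the derived share counts are estimates from this arithmetic, not numbers from the filing):

```python
# Back out the share counts implied by Dropbox's stated valuation ranges.
# All dollar figures come from the article; the share counts are derived.
price_low, price_high = 18.0, 20.0

# Offering valuation: $7.2B at the low end of the range, $7.96B at the high end
basic_low = 7.2e9 / price_low      # ~400M shares
basic_high = 7.96e9 / price_high   # ~398M shares

# Fully diluted valuation: $7.8B to $8.75B
diluted_low = 7.8e9 / price_low     # ~433M shares
diluted_high = 8.75e9 / price_high  # ~437.5M shares

print(f"implied basic count: ~{basic_low / 1e6:.0f}M to {basic_high / 1e6:.0f}M shares")
print(f"implied fully diluted count: ~{diluted_low / 1e6:.0f}M to {diluted_high / 1e6:.0f}M shares")
```

Both ends of the price range land at roughly the same count (about 400 million basic shares), which is why the price band maps so cleanly onto the valuation band.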

There will indeed be some shareholders selling stock in this offering, though for the most part the ownership is going to stay the same. There are plenty of reasons to sell stock beyond just getting liquidity, such as covering taxes on option exercises, so it’s not clear exactly what the motivations of some of these sellers are for now.

Dropbox has more than 500 million users, 11 million of whom are paying users. While originally born as a consumer service, the company has sought to crack into the enterprise in order to build a robust second line of business alongside its consumer operations. Dropbox at the start had the benefit of spreading via word of mouth thanks to its dead-simple interface, but it has since started building out new tools geared toward larger businesses, such as Dropbox Paper.

It’s also what’s made this IPO a somewhat tricky one. The process is normally the same: the company sets a price range and then throws it out there to see who bites. If things go well, the range goes up; if things go poorly, as in the case of Blue Apron, the range drops. This could always change at the last minute, but you can take this as another step toward Dropbox’s eventual listing, which is expected to happen later this week.


Microsoft Power Apps update includes new Common Data Service

Microsoft announced the spring update to its Power BI and Power Apps platforms today with a significant enhancement: a new Common Data Service that enables companies to build data-driven applications from a variety of data sources.

This is part of a wider strategy that is designed to remove some of the complexity associated with gathering, processing and incorporating data into applications.

Microsoft is essentially giving customers access to the same set of tools and services it has used internally to build Dynamics 365, its enterprise suite of tools that includes CRM, marketing automation and field service along with enterprise resource planning tools (ERP).

While the company has been allowing third-party developers to build applications on the platform for about 18 months with its Power Apps tools, they haven’t been able to take advantage of the data under the hood without some heavy lifting. Microsoft aims to change that with the Common Data Service.

Diagram: Microsoft

“What that service means, practically speaking, is that it’s not only a place to store data, but a model (schema) that is stamped out there with everything you would need to build a business app around [elements] such as contacts, events, customers [and so forth],” explained Ryan Cunningham, Microsoft program manager for Power Apps. This allows programmers to take advantage of pre-built relationships and rules, and how they should be enforced, without having to code them from scratch.

Cunningham points out that they tried to make it fairly simple to build the apps, while still providing a level of customization and the ability to use Microsoft data or data from another source. That’s where the Common Data Service comes in.

He says that developers can take advantage of the 200 connectors that come pre-built out of the box to connect to all the data customers have been collecting in Microsoft products, but they aren’t limited to Microsoft data. “You can still build custom applications on top of the platform, and get the benefit of the platform we’ve built our tools on,” he said.

The Common Data Service is part of a much broader set of announcements around the spring releases of the Dynamics 365, Office 365 and Power BI platforms, all announced today.


Travis Kalanick is already back running a company with a $150M investment

Travis Kalanick, the former Uber CEO who was shown the door in June last year amid a series of major controversies, has already found his next leading role following his announcement of a new investment fund just weeks ago.

Kalanick said on Twitter that his fund would be investing $150 million to take a controlling interest in City Storage Systems, or CSS, a holding company focused on redevelopment of distressed real estate. He will also be running the company as CEO, according to Recode. Kalanick resigned from Uber after facing a lawsuit from Waymo over trade secrets, an ongoing battle with existing shareholder Benchmark Capital, and the fallout from a harassment probe led by former attorney general Eric Holder. Uber brought on new CEO Dara Khosrowshahi in August last year.

Kalanick announced that he would be starting a new fund with his windfall from Uber shares sold in the company’s most recent major secondary round. At the time, Kalanick said the new fund — called 10100, or “ten one hundred” — would be geared toward “large-scale job creation,” with investments in real estate, e-commerce, and “emerging innovation in India and China.” CSS has two businesses, CloudKitchens and CloudRetail, which focus on redevelopment of distressed assets in those two areas.

The former is pretty interesting given that Uber has its own food delivery service, UberEats. Should Kalanick’s new venture find ways to acquire distressed food-related real estate — kitchens around a city, for example — there may be a natural overlap with his experience at Uber as it started to explore food. Having one massive operating kitchen in a single location with a delivery fleet attached is one thing; an array of smaller kitchens redeveloped through a company like CSS could provide a distributed network that makes it quicker to get food from a kitchen to its delivery destination.

It’s not that we know CSS is focusing on that explicitly, but consider that Amazon bought Whole Foods — and its network of buildings — for $13.7 billion, and now it has a two-hour delivery service in major metropolitan areas. Of course, Kalanick was shown the door at Uber, so it remains to be seen how this one is going to play out. The Information notes that CSS was owned by a friend of Kalanick’s and has a loose connection with Uber.


Salesforce is buying MuleSoft at enterprise value of $6.5 billion

Salesforce announced today that it intends to buy MuleSoft in a deal valued at a whopping $6.5 billion. That’s not necessarily the selling price, but the company’s enterprise value, which accounts for its stock, debt and cash on hand. The exact price was not available yet, but Salesforce did indicate it is paying $44.89 per share for MuleSoft, a price that represents a 36 percent premium over yesterday’s closing price.

What’s more, the deal values each MuleSoft share at $36 in cash and 0.0711 shares of Salesforce common stock.
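Those two components pin down the Salesforce share price the deal math assumes; a quick sanity check of the arithmetic from the figures above:

```python
# Reconcile the $44.89 per-share deal value with its cash-plus-stock split.
deal_price = 44.89       # total value per MuleSoft share
cash_per_share = 36.00   # cash component
exchange_ratio = 0.0711  # Salesforce shares per MuleSoft share

# The stock component must cover the remaining $8.89, which implies a
# Salesforce reference price of roughly $125 per share.
implied_crm_price = (deal_price - cash_per_share) / exchange_ratio

# A 36 percent premium implies yesterday's MuleSoft close was about $33.
implied_close = deal_price / 1.36

print(f"implied Salesforce price: ${implied_crm_price:.2f}")
print(f"implied MuleSoft prior close: ${implied_close:.2f}")
```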

Rumors began swirling this morning after Reuters broke a story that the CRM giant was interested in MuleSoft, which launched in 2006 and went public almost exactly a year ago. With 1,200 customers, MuleSoft gives Salesforce a mature company to add to its arsenal. It also gives Salesforce an API integration engine that should help the company access data across organizations, regardless of where it lives.

This is particularly important for Salesforce, which tends to come in and work with a company across enterprise systems. As it builds out its artificial intelligence and machine learning layer, which it has branded as Einstein, it needs access to data across the company. A company like MuleSoft gives them that.

But of course Salesforce gets more than tech with this purchase, which it can integrate into its growing family of products. It also gets major customers like Coca-Cola, VMware, GE, Accenture, Airbus, AT&T and Cisco. While Salesforce may have a presence in some of these companies already, MuleSoft gives them entrée into areas they might not have had, and gives them the ability to expand that presence.

What’s more, the company has big revenue goals. Having reached $10 billion in revenue faster than any software company ever has, a point that chairman and co-founder Marc Benioff has been happy to make, they have actually set their sights on $60 billion by 2034. That’s a long way away, of course, but having a company like MuleSoft in the fold, which made almost $300 million in revenue in fiscal 2017, will certainly help.

Ray Wang, founder and principal analyst at Constellation Research, sees the deal as a bet on a microservices future. “This is the heart of Salesforce’s M&A strategy. They have to integrate, orchestrate, and manage microservices in their future roadmap,” he said. “The AI-driven world ahead needs contextual microservices.”

Microservices are a way of building applications out of small, distinct pieces, rather than as the single, monolithic applications we tended to build in the past. This makes changing and updating them easier and more efficient.

Brent Leary, owner and principal at CRM Essentials, a CRM consulting firm, sees the deal through a customer prism. “Well, it shows just how crucial [Internet of Things] and [Artificial Intelligence] is to the future of Salesforce‘s ability to create the customer success platform of the future,” he said.

“It also reinforces that they feel investing deeper into customer success is a better ROI and growth play than extending to other enterprise app areas outside of their core focus,” Leary added.

As with all deals of this ilk, it needs to pass regulatory muster first, but if it does, it is expected to close at the end of July.


Oracle’s cloud biz heading in the wrong direction right now

Oracle announced its quarterly earnings last night, detailing that its cloud business grew 32 percent to $1.6 billion in the quarter. That might sound good at first blush, but it’s the third straight quarter of decelerating growth — a fact that had investors jittery overnight. It didn’t get better today, with Oracle’s stock plunging over 9 percent as of this writing.

When you consider that enterprise business is shifting rapidly to the cloud, and that the cloud business in general is growing quickly, Oracle’s cloud numbers could be reason for concern. While it’s hard to nail down what “cloud” means in technology companies’ earnings, since each one counts infrastructure, software and platform differently, the general trend at Oracle seems contrary to the eye-popping growth numbers we have seen from other companies.

Oracle against the world

Oracle’s cloud revenue broke down as follows: SaaS up 33 percent to $1.2 billion, and platform and infrastructure revenue combined up 28 percent to $415 million. To put those figures into context, consider that last quarter Alibaba reported overall cloud revenue of $533 million, which was up a whopping 104 percent year over year.
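Working backward from those growth rates gives a sense of the year-ago base each segment is growing from. This is a rough back-calculation from the reported figures, not Oracle’s own disclosure:

```python
# Estimate the year-ago revenue implied by a current figure and its YoY growth.
def year_ago(current, growth_pct):
    """Back out the prior-year figure from a year-over-year growth rate."""
    return current / (1 + growth_pct / 100)

saas_now = 1.2e9        # SaaS, up 33 percent
paas_iaas_now = 415e6   # platform + infrastructure, up 28 percent

saas_then = year_ago(saas_now, 33)            # roughly $0.9B a year ago
paas_iaas_then = year_ago(paas_iaas_now, 28)  # roughly $324M a year ago

total_now = saas_now + paas_iaas_now  # ~$1.6B, matching the reported total
print(f"SaaS a year ago: ${saas_then / 1e9:.2f}B")
print(f"PaaS+IaaS a year ago: ${paas_iaas_then / 1e6:.0f}M")
```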

Looking purely at Infrastructure services, Canalys reported that in the third quarter of 2017, Microsoft grew at around 90 percent year over year, while Google grew around 75 percent YoY. Even market leader Amazon, which controls over 30 percent of the market, had around a 40 percent growth rate, fairly remarkable given its size.

All of that suggests that Oracle, which came to the cloud late, should be on a higher growth trajectory than it’s currently showing. That’s because it’s generally easier to grow from a small number than from a big number to a bigger one (as Amazon has had to do).

The company’s on-prem software revenue continues to grow (which includes lucrative license and maintenance revenue from existing customers), and still accounts for the vast majority of its top line. However, at this point, you would think Oracle would want to see that revenue growth shifting away from on-prem and towards its cloud business.

What’s worse is that co-CEO Safra Catz predicted in the earnings call with analysts that the cloud growth could dive even further next quarter. “Cloud revenues including SaaS, PaaS and IaaS [all cloud business combined] are expected to grow 19% to 23% in USD, 17% to 21% in constant currency,” she told analysts this week.

Oracle co-CEO Safra Catz. Photo: Kimihiro Hoshino/AFP/Getty Images

Looking for a brighter future

Chairman Larry Ellison tried to point to the fully automated cloud database product announced at Oracle OpenWorld last fall as a proof point of a brighter cloud future, but so far the numbers are not bearing that out. It’s worth noting that he did also indicate that more automated cloud products are on the way.

Oracle has spent the last several years putting a lot of cloud pieces together, and as Catz pointed out, it doesn’t have to invest further to handle additional capacity in its SaaS business. But with the numbers heading in the wrong direction, capacity may not be the problem.

Oracle certainly has enterprise credibility, and that should bode well for its cloud business, but as a latecomer to the market it should be seeing much brisker overall growth than this. Over time that may happen, but for now Wall Street was not happy with Oracle’s results, and the company probably has to show more from its cloud products before it can change investors’ minds.


Windows Server 2019 is now available in preview

Microsoft today announced the next version of Windows Server, which launches later this year under the not completely unexpected moniker of “Windows Server 2019.” Developers and operations teams that want to get access to the bits can now get the first preview build through Microsoft’s Insider Program.

This next version comes with plenty of new features, but it’s also worth noting that this is the next release in the Long-Term Servicing Channel for Windows Server, which means that customers will get five years of mainstream support and can get an extra five years of extended support. Users can also opt for a semi-annual channel that features — surprise — two releases per year, for teams that want faster access to new features. Microsoft recommends the long-term option for infrastructure scenarios like running SQL Server or SharePoint.

So what’s new in Windows Server 2019? Given Microsoft’s focus on hybrid cloud deployments, it’s no surprise that Windows Server also embraces these scenarios. Specifically, this means that Windows Server 2019 will be able to easily connect to Microsoft Azure and that users will be able to integrate Azure Backup, File Sync, disaster recovery and other services into their Windows Server deployments.

Microsoft also added a number of new security features, which are mostly based on what the company has learned from running Azure and previous versions of Windows. These include new shielded VMs for protecting Linux applications and support for Windows Defender Advanced Threat Protection, one of Microsoft’s flagship security products that helps guard machines against attacks and zero-day exploits.

With this release, Microsoft is also bringing its container technologies from the semi-annual release channel to the long-term release channel. These include the ability to run Linux containers on Windows and the Windows Subsystem for Linux that enables this, as well as the ability to run Bash scripts on Windows. And for those of you who are really into containers, Microsoft also today noted that it will offer more container orchestration choices, including Kubernetes support, soon. These will first come to the semi-annual channel, though.

You can find a more detailed breakdown of what’s new in this release here.


Salesforce is reportedly in talks to acquire MuleSoft and the stock is going nuts

After previously investing in MuleSoft, it looks like Salesforce may finish off the deal and is in advanced talks to acquire the data integration software provider outright, according to a report from Reuters this morning.

MuleSoft works with companies to bring together different sources of data, such as varying APIs, into one place. That’s important for companies that have data coming in from all over, whether from online applications or actual devices, and the company counts Netflix and Spotify as customers. It would also give Salesforce another piece of the lock-in puzzle for enterprises that need to manage larger and larger pools of data as they look to start pumping out machine learning tools that can act on all that data.

As usual, these talks could fall apart — we saw this happen with Twitter a few years ago after the company looked at buying what was essentially the largest customer service channel on the planet (as in, great for whining at brands) — but Reuters reports that the deal could be announced as soon as this week. MuleSoft’s stock has jumped nearly 20% this year; it went public last year amid a wave of enterprise IPOs jumping through the so-called IPO window while it’s open.

Salesforce is increasingly making a push into AI with products like Einstein, which it launched in 2016. Those tools give businesses predictive services and recommendations, the hallmark of what can come out of increasing piles of data based on customer activity. But all of that data has to come from somewhere, and for now, providers outside the Salesforce ecosystem stitch it all together. Having it all in one central place makes it easier to run the data through machine learning algorithms and start building predictive models.

We reached out to Salesforce and MuleSoft for comment and will update the post when we hear back.


Mythic nets $40M to create a new breed of efficient AI-focused hardware

Another huge financing round is coming in for an AI company today, this time for a startup called Mythic, which is picking up a fresh $40 million as massive deals close left and right in the sector.

Mythic focuses particularly on the inference side of AI operations — basically making the calculation on the spot based on an extensively trained model. The chips are designed to be small and low-power while achieving the kind of performance you’d expect from a GPU, with the lightning-fast operations that algorithms need to perform to figure out whether that thing your car is about to run into is a cat or just some text on the road. SoftBank Ventures led this most recent round of funding, with a strategic investment also coming from Lockheed Martin Ventures. ARM executive Rene Haas will also be joining the company’s board of directors.

Mythic, like other startups, is looking to cut down the back-and-forth trips to memory in order to speed things up and lower power consumption, and CEO Michael Henry says the company has figured out how to essentially do the operations — based in a field of mathematics called linear algebra — on flash memory itself.

“The key to getting really high performance and really good energy efficiency is to keep everything on the chip,” Henry said. “The minute you have to go outside the chip to memory, you lose all performance and energy. It just goes out the window. Knowing that, we found that you can actually leverage flash memory in a very special way. The limit there is, it’s for inference only, but we’re only going after the inference market — it’s gonna be huge. On top of that, the challenge is getting the processors and memory as close together as possible so you don’t have to move around the data on the chip.”

Mythic’s approach is designed to be what Henry calls more analog. To visualize how it might work, imagine a set-up in Minecraft, with a number of different strings of blocks leading to an end gate. If you flipped a switch to turn 50 of those strings on with some unit value, leaving the rest off, and joined them at the end and saw the combined final result of the power, you would have completed something similar to an addition operation leading to a sum of 50 units. Mythic’s chips are designed to do something not so dissimilar, finding ways to complete those kinds of analog operations for addition and multiplication in order to handle the computational requirements for an inference operation. The end result, Henry says, consumes less power and dissipates less heat while still getting just enough accuracy to get the right solution (more technically: the calculations are 8-bit results).
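The precision trade-off behind that analogy can be sketched in software: quantize weights and inputs to 8 bits, accumulate the products, and compare against the full-precision answer. This is an illustrative model of 8-bit inference in general, assuming nothing about Mythic’s actual circuit design:

```python
import numpy as np

rng = np.random.default_rng(0)

# Full-precision weights and inputs for one neuron's dot product
w = rng.uniform(-1, 1, size=256)
x = rng.uniform(-1, 1, size=256)
exact = float(w @ x)

# Quantize both to signed 8-bit integers: map [-1, 1] onto [-127, 127]
scale = 127.0
w_q = np.round(w * scale).astype(np.int8)
x_q = np.round(x * scale).astype(np.int8)

# Accumulate in a wider type (int32) so the sum doesn't overflow,
# then rescale the integer result back to the floating-point domain
acc = int(np.sum(w_q.astype(np.int32) * x_q.astype(np.int32)))
approx = acc / (scale * scale)

# The 8-bit answer lands close to the exact one: "just enough" accuracy
print(f"exact={exact:.4f} approx={approx:.4f} err={abs(exact - approx):.4f}")
```

The error stays small because the quantization noise on each of the 256 products largely averages out in the accumulation, which is why 8-bit results are usually sufficient for inference even though they would be too coarse for training.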

After that, the challenge is sticking a layer on top of that to make it look and behave like a normal chip to a developer. The goal, like other players in the AI hardware space, is to just plug into frameworks like TensorFlow. Those frameworks abstract out all the complicated tooling and tuning required for such a specific piece of hardware and make it very approachable and easy for developers to start building machine learning projects. Andrew Feldman, CEO of another AI hardware startup called Cerebras Systems, said at the Goldman Sachs Technology and Internet conference last month that frameworks like TensorFlow had eroded much of the advantage Nvidia gained from building up an ecosystem for developers on its own systems.

Henry, too, is a big TensorFlow fan, and for good reason: frameworks like TensorFlow allow next-generation chip ideas to even get off the ground in the first place. These frameworks, which have become increasingly popular with developers, abstract out the complexity of working with specific low-level hardware like a field-programmable gate array (FPGA) or a GPU. That has made building machine learning operations much easier for developers and led to an explosion of activity in machine learning, whether for speech or image recognition, among a number of other use cases.

“Things like TensorFlow make our lives so much easier,” Henry said. “Once you have a neural network described on TensorFlow, it’s on us to take that and translate that onto our chip. We can abstract that difficulty by having an automatic compiler.”

While many of these companies talk about getting massive performance gains over a GPU — and, to be sure, Henry hopes that’ll be the case — the near-term goal for Mythic is to match the performance of a $1,000 GPU while showing it can take up less space and consume less power. There’s a market for a card that customers can hot-swap in right away, and Henry says the company is focused on a PCI-E interface, a very common plug-and-play system, and that’s it.

The challenge for Mythic, however, is going to be getting into the actual design of the hardware that comes out. It’s one thing to sell a bunch of cards that companies can stick into their existing hardware, but it’s another to get embedded into the actual pieces of hardware themselves — which is what’s going to need to happen if it wants to be a true workhorse for devices on the edge, like security cameras or devices handling speech recognition. That makes the buying cycle a little more difficult, but at the same time, there will be billions of devices out there that need advanced hardware to power their inference operations.

“If we can sell a PCI card, you buy it and drop it in right away, but those are usually for low-volume, high-selling price products,” Henry said. “The other customers we serve design you into the hardware products. That’s a longer cycle, that can take upwards of a year. For that, typically the volumes are much higher. The nice thing is that you’re really really sticky. If they design you into a product you’re really sticky. We can go after both, we can go after board sales, and then go after design.”

There are probably going to be two big walls for Mythic, as for any of the other players out there. The first is that none of these companies have shipped a product. While Mythic, or other companies, might have a proof-of-concept chip they can drop on the table, getting to a production-ready piece of next-generation silicon is a dramatic undertaking. Then there’s the process of not only getting people to buy the hardware, but actually convincing them that they’ll have the systems in place to ensure that developers will build on that hardware. Mythic says it plans to have a sample for customers by the end of the year, with a production product by 2019.

That also explains why Mythic, along with those other startups, is able to raise enormous rounds of money — which means there’s going to be a lot of competition among all of them. Here’s a quick list of the fundraising so far: SambaNova Systems raised $56 million last week; Graphcore raised $50 million in November last year; Cerebras Systems’ first round was $25 million in December 2016; and this isn’t even counting an increasing amount of activity among companies in China. There’s still definitely a segment of investors that considers the space way too hot (and there is, indeed, a ton of funding), or potentially unnecessary if you don’t need the bleeding-edge efficiency or power of these products.

And there are, of course, the elephants in the room in the form of Nvidia and, to a lesser extent, Intel. The latter is betting big on FPGAs and other products, while Nvidia has snapped up most of the market thanks to GPUs being much more efficient at the kind of math needed for AI. The play for all these startups is that they can be faster, more efficient or, in the case of Mythic, cheaper than all those other options. It remains to be seen whether they’ll unseat Nvidia, but nonetheless there’s an enormous amount of funding flowing in.

“The question is, is someone going to be able to beat Nvidia when they have the valuation and cash reserves,” Henry said. “But the thing is, we’re in a different market. We’re going after the edge, we’re going after things embedded inside phones and cars and drones and robotics, for applications like AR and VR, and it’s just really a different market. When investors analyze us they have to think of us differently. They don’t think, is this the one that beats Nvidia; they think, will one or more of these powder keg markets explode. It’s a different conversation for us because we’re an edge company.”


DocuSign has filed confidentially for IPO

DocuSign is gearing up to go public in the next six months, sources tell TechCrunch.

The company, which pioneered the e-signature, has now filed confidentially, we are hearing. Utilizing a commonly used provision of the JOBS Act, DocuSign submitted its IPO filing behind closed doors and will reveal it weeks before its public debut.

Like Dropbox, which is finally going public this week, San Francisco-based DocuSign has been an anticipated IPO for several years now. It has raised over $500 million since it was founded in 2003 and has been valued at $3 billion. Kleiner Perkins, Bain Capital, Intel Capital, GV (Google Ventures) and Dell are among the many well-known names that have invested in DocuSign.

But like many “unicorns” these days, the company took its time, spending 15 years as a private company. The DocuSign team decided that 2018 is the year for its debut and is targeting an IPO in either the second or third quarter.

DocuSign, which competes with HelloSign and Adobe Sign, amongst others, has been on a mission to get the world’s businesses to sign documents online. The team has worked with large enterprises like T-Mobile, Salesforce, Morgan Stanley and Bank of America.

Real estate, financial services, insurance and healthcare are amongst its key industries. Legal, sales and human resource departments frequently use DocuSign to send and sign documents.

In addition to large enterprises, DocuSign also offers services for small and mid-sized businesses. Individual consumers are able to use DocuSign services, too.

The company has a tiered business model, with corporations paying more for added services. Public investors will be evaluating DocuSign both on its revenue growth and customer retention.

North America is its largest market, but it’s also been focused on expanding throughout the world, including the U.K., France, Australia, Brazil, Singapore and Japan.

Since its inception, DocuSign has undergone several management changes. Early last year, Dan Springer took the helm. He was formerly CEO of Responsys, which went public and then was bought by Oracle for $1.5 billion.

Keith Krach, who is now chairman, had been running the company since 2011. Krach was previously CEO of Ariba, which was acquired by SAP for $4.3 billion.

DocuSign declined to comment.

The past few years have been slow for tech IPOs, which is a disappointment for Silicon Valley venture capitalists who can make a lot of money this way. But for the most part, enterprise tech companies have fared better than consumer tech companies, because strong customer retention makes it easier to predict growth.

Dropbox will debut this week and Spotify is slated to go public in April through a different process known as a “direct listing.” Zuora also recently revealed its IPO filing, implying that the company is expecting to debut in the coming weeks.


Apple, IBM add machine learning to partnership with Watson-Core ML coupling

Apple and IBM may seem like an odd couple, but the two companies have been working closely together for several years now. That has involved IBM sharing its enterprise expertise with Apple and Apple sharing its design sense with IBM. The companies have actually built hundreds of enterprise apps running on iOS devices. Today, they took that friendship a step further when they announced they were providing a way to combine IBM Watson machine learning with Apple Core ML to make the business apps running on Apple devices all the more intelligent.

The way it works is that a customer builds a machine learning model using Watson, taking advantage of data in an enterprise repository to train the model. For instance, a company may want to help field service techs point their iPhone camera at a machine and identify the make and model to order the correct parts. You could potentially train a model to recognize all the different machines using Watson’s image recognition capability.

The next step is to convert that model into Core ML and include it in your custom app. Apple introduced Core ML at the Worldwide Developers Conference last June as a way to make it easy for developers to move machine learning models from popular model building tools like TensorFlow, Caffe or IBM Watson to apps running on iOS devices.

After creating the model, you run it through the Core ML converter tools and insert it in your Apple app. The agreement with IBM makes it easier to do this using IBM Watson as the model building part of the equation. This allows the two partners to make the apps created under the partnership even smarter with machine learning.

“Apple developers need a way to quickly and easily build these apps and leverage the cloud where it’s delivered. [The partnership] lets developers take advantage of the Core ML integration,” Mahmoud Naghshineh, general manager for IBM Partnerships and Alliances explained.

To make it even easier, IBM also announced a cloud console to simplify the connection between the Watson model building process and inserting that model in the application running on the Apple device.

Over time, the app can share data back with Watson and improve the machine learning algorithm running on the edge device in a classic device-cloud partnership. “That’s the beauty of this combination. As you run the application, it’s real time and you don’t need to be connected to Watson, but as you classify different parts [on the device], that data gets collected and when you’re connected to Watson on a lower [bandwidth] interaction basis, you can feed it back to train your machine learning model and make it even better,” Naghshineh said.
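The loop Naghshineh describes, inferring locally and syncing data back when a connection is available, is a common edge-ML pattern. A minimal sketch of the idea follows; the class names and the toy threshold-averaging “trainer” are hypothetical stand-ins, not IBM’s or Apple’s actual APIs:

```python
# Illustrative device-cloud loop: the device classifies while offline,
# buffers what it saw, and ships the buffer back when connected.
class EdgeModel:
    def __init__(self, threshold=0.5):
        self.threshold = threshold  # stands in for real model weights
        self.buffer = []            # observations collected while offline

    def classify(self, score):
        """Run 'inference' locally; no cloud connection required."""
        label = score >= self.threshold
        self.buffer.append((score, label))
        return label

    def sync_to_cloud(self, cloud):
        """On a low-bandwidth connection, upload the buffer and pull new weights."""
        self.threshold = cloud.retrain(self.buffer)
        self.buffer = []


class CloudTrainer:
    def retrain(self, observations):
        # Toy "training": move the threshold toward the mean observed score.
        scores = [s for s, _ in observations]
        return sum(scores) / len(scores)


device, cloud = EdgeModel(), CloudTrainer()
for s in (0.2, 0.9, 0.7):
    device.classify(s)       # works with no connection to the cloud
device.sync_to_cloud(cloud)  # threshold updated from field data, buffer cleared
print(device.threshold)      # ~0.6, the mean of the buffered scores
```

The key property is the one Naghshineh highlights: inference never blocks on the network, and the cloud round-trip only happens opportunistically, to improve the model.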

The point of the partnership has always been to use data and analytics to build new business processes, by taking existing approaches and reengineering them for a touch screen.

This adds a level of machine learning to that original goal, moving it forward to take advantage of the latest tech. “We are taking this to the next level through machine learning. We are very much on that path and bringing improved accelerated capabilities and providing better insight to [give users] a much greater experience,” Naghshineh said.
