Oct 15, 2018

Celonis brings intelligent process automation software to the cloud

Celonis has been helping companies analyze and improve their internal processes using machine learning. Today the company announced it was providing that same solution as a cloud service with a few nifty improvements you won’t find on prem.

The new approach, called Celonis Intelligent Business Cloud, allows customers to analyze a workflow, find inefficiencies and suggest improvements very quickly. Companies typically follow a workflow that has developed over time and very rarely think about why it developed the way it did, or how to fix it. If they do, it usually involves bringing in consultants to help. Celonis brings software and machine learning to bear on the problem.

Co-founder and CEO Alexander Rinke says that his company deals with massive volumes of data and moving all of that to the cloud makes sense. “With Intelligent Business Cloud, we will unlock that [on prem data], bring it to the cloud in a very efficient infrastructure and provide much more value on top of it,” he told TechCrunch.

The idea is to speed up the whole ingestion process, allowing a company to see the inefficiencies in their business processes very quickly. Rinke says it starts with ingesting data from sources such as Salesforce or SAP and then creating a visual view of the process flow. There may be hundreds of variants from the main process workflow, but you can see which ones would give you the most value to change, based on the number of times the variation occurs.
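The core of that variant analysis is simple to illustrate. The sketch below is not Celonis's implementation — the event-log format and field names are assumptions — but it shows the basic idea of grouping events by case and counting how often each end-to-end path occurs:

```python
from collections import Counter

# Hypothetical event log: (case_id, activity, timestamp) rows, as might
# be extracted from a system like Salesforce or SAP.
event_log = [
    ("order-1", "create", 1), ("order-1", "approve", 2), ("order-1", "ship", 3),
    ("order-2", "create", 1), ("order-2", "ship", 2),  # approval step skipped
    ("order-3", "create", 1), ("order-3", "approve", 2), ("order-3", "ship", 3),
]

def variant_frequencies(log):
    """Group events by case, order each case by timestamp, and count
    how often each distinct activity sequence (variant) occurs."""
    cases = {}
    for case_id, activity, ts in sorted(log, key=lambda e: (e[0], e[2])):
        cases.setdefault(case_id, []).append(activity)
    return Counter(tuple(seq) for seq in cases.values())

# The most frequent variants are the ones most worth fixing first.
for variant, count in variant_frequencies(event_log).most_common():
    print(" -> ".join(variant), count)
```

A real process-mining tool layers visualization and statistics on top, but the ranking principle — fix the high-frequency deviations first — is the same.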

Screenshot: Celonis

By packaging its tools as a cloud service, Celonis is reducing the complexity of running and managing them. It is also introducing an app store with over 300 pre-packaged options for popular products like Salesforce and ServiceNow, and popular processes like order-to-cash. This should also help get customers up and running much more quickly.

New Celonis App Store. Screenshot: Celonis

The cloud service also includes an Action Engine, which Rinke describes as a big step toward moving Celonis from being purely analytical to operational. “Action Engine focuses on changing and improving processes. It gives workers concrete info on what to do next. For example in process analysis, it would notice on-time delivery isn’t great because order to cash is too slow. It helps accelerate changes in system configuration,” he explained.

Celonis Action Engine. Screenshot: Celonis

The new cloud service is available today. Celonis was founded in 2011. It has raised over $77 million. The most recent round was a $50 million Series B on a valuation over $1 billion.

Oct 15, 2018

Truphone, an eSIM mobile carrier that works with Apple, raises another $71M, now valued at $507M

Truphone — a UK startup that provides global mobile voice and data services by way of an eSIM model for phones, tablets and IoT devices — said that it has raised another £18 million ($23.7 million) in funding; plus it said it has secured £36 million ($47 million) more “on a conditional basis” to expand its business after signing “a number of high-value deals.”

It doesn’t specify which deals these are, but Truphone was an early partner of Apple’s to provide eSIM-based connectivity to the iPad — that is, a way to access a mobile carrier without having to swap in a physical SIM card, which has up to now been the standard for GSMA-based networks. Truphone is expanding on this by offering a service for the new iPhone XS and XR models, taking advantage of the dual SIM capability in these devices. Truphone says that strategic partners of the company include Apple (“which chose Truphone as the only carrier to offer global data, voice and text plans on the iPad and iPhone digital eSIM”); Synopsys, which has integrated Truphone’s eSIM technology into its chipset designs; and Workz Group, a SIM manufacturer, which has a license from Truphone for its GSMA-accredited remote SIM provisioning platform and SIM operating system.

The company said that this funding, which was made by way of a rights issue, values Truphone at £386 million ($507 million at today’s rates) post-money. Truphone told TechCrunch that the funding came from Vollin Holdings and Minden Worldwide — two investment firms with ties to Roman Abramovich, the Russian oligarch who also owns the Chelsea football club, among other things — along with unspecified minority shareholders. Collectively, Abramovich-connected entities control more than 80 percent of the company.

We have asked the company for more detail on what the conditions are for the additional £36 million in funding to be released and all it is willing to say is that “it’s KPI-driven and related to the speed of growth in the business.” It’s unclear what the state of the business is at the moment because Truphone has not updated its accounts at Companies House (they are overdue). We have asked about that, too.

For some context, Truphone most recently raised money almost exactly a year ago, when it picked up £255 million, also by way of a rights issue and also from the same two big investors. The large amount that time was partly raised to retire debt. That deal was done at a valuation of £370 million ($491 million at the time of the deal). Going just on sterling values, and netting out the new £18 million, this is effectively a slight down-round.

Truphone, however, says that business is strong right now:

“The appetite for our technology has been enormous and we are thrilled that our investors have given us the opportunity to accelerate and scale these groundbreaking products to market,” said Ralph Steffens, CEO, Truphone, in a statement. “We recognised early on that the more integrated the supply chain, the smoother the customer experience. That recognition paid off—not just for our customers, but for our business. Because we have this capability, we can move at a speed and proficiency that has never before been seen in our industry. This investment is particularly important because it is testament not just to our investors’ confidence in our ambitions, but pride in our accomplishments and enthusiasm to see more of what we can do.”

Truphone is one of a handful of providers working with Apple to provide plans for the digital eSIM by way of the MyTruphone app. Essentially this will give users an option for international data plans while travelling — Truphone’s network covers 80 countries — without having to swap out the SIMs for their home networks.

The eSIM technology is bigger than the iPhone itself, of course: some believe it could be the future of how we connect on mobile networks. On phones and tablets, it does away with users ordering, and inserting or swapping small, fiddly chips into their devices (that ironically is also one reason that carriers have been resistant to eSIMs traditionally: it makes it much easier for their customers to churn away). And in IoT networks where you might have thousands of connected, unmanned devices, this becomes one way of scaling those networks.

“eSIM technology is the next big thing in telecommunications and the impact will be felt by everyone involved, from consumers to chipset manufacturers and all those in-between,” said Steve Alder, chief business development officer at Truphone. “We’re one of only a handful of network operators that work with the iPhone digital eSIM. Choosing Truphone means that your new iPhone works across the world—just as it was intended.” Of note, Alder was the person who brokered the first iPhone carrier deal in the UK, when he was with O2.

However, one thing to consider when sizing up the eSIM market is that rollout has been slow so far: there are around 10 countries where there are carriers that support eSIM for handsets. Combining that with machine-to-machine deployments, the market is projected to be worth $254 million this year. Still, forecasts put the market size at $978 million by 2023, possibly pushed along by hardware companies like Apple making it an increasingly central part of the proposition, initially as a complement to a “home carrier.”
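Taken together, those two forecast figures imply a steep growth curve. As a quick back-of-the-envelope check, treating 2018 to 2023 as five compounding years and applying the standard compound-annual-growth-rate formula:

```python
# Implied compound annual growth rate from the market forecasts cited
# above: $254M in 2018 growing to $978M in 2023 (five years).
start, end, years = 254e6, 978e6, 5
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # prints "implied CAGR: 30.9%"
```

In other words, the forecast assumes the market nearly quadruples, growing at roughly 31 percent a year.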

Truphone has not released numbers detailing how many devices are using its eSIM services at the moment — either among enterprises or consumers — but it has said that customers include more than 3,500 multinational enterprises in 196 countries. We have asked for more detail and will update this post as we learn more.

Oct 12, 2018

Anaplan hits the ground running with strong stock market debut up over 42 percent

You might think that Anaplan CEO Frank Calderoni would have had a few sleepless nights this week. His company picked a bad week to go public, as market instability rocked tech stocks. Still, he wasn’t worried, and today the company had, by any measure, a successful debut, with the stock soaring over 42 percent. As of 4 pm ET, it hit $24.18, up from the IPO price of $17. Not a bad way to launch your company.

Stock Chart: Yahoo Finance

“I feel good because it really shows the quality of the company, the business model that we have and how we’ve been able to build a growing successful business, and I think it provides us with a tremendous amount of opportunity going forward,” Calderoni told TechCrunch.

Calderoni joined the company a couple of years ago, and seemed to emerge from Silicon Valley central casting: former CFO at Red Hat and Cisco, along with stints at IBM and SanDisk. He said he often wished a tool like Anaplan had been around when he was in charge of a several-thousand-person planning operation at Cisco. He indicated that while that operation was successful, it could have been even more so with a tool like Anaplan.

“The planning phase has not had much change in several decades. I’ve been part of it and I’ve dealt with a lot of the pain. And so having something like Anaplan, I see it’s really being a disrupter in the planning space because of the breadth of the platform that we have. And then it goes across organizations to sales, supply chain, HR and finance, and as we say, really connects the data, the people and the plan to make for better decision making as a result of all that,” he said.

Calderoni describes Anaplan as a planning and data analysis tool. In his previous jobs, he says, he spent a ton of time just gathering data and making sure it was the right data, but precious little time on analysis. In his view, Anaplan lets companies concentrate more on the crucial analysis phase.

“Anaplan allows customers to really spend their time on what I call forward planning where they can start to run different scenarios and be much more predictive, and hopefully be able to, as we’ve seen a lot of our customers do, forecast more accurately,” he said.

Anaplan was founded in 2006 and raised almost $300 million along the way. It achieved a lofty valuation of $1.5 billion in its last round, which was $60 million in 2017. The company has just under 1000 customers including Del Monte, VMware, Box and United.

Calderoni says that although the company has 40 percent of its business outside the US, there are plenty of markets left to conquer, and it hopes to use today’s cash infusion in part to continue expanding into a worldwide company.

Oct 12, 2018

IBM files formal JEDI protest a day before bidding process closes

IBM announced yesterday that it has filed a formal protest with the U.S. Government Accountability Office over the structure of the Pentagon’s winner-take-all $10 billion, 10-year JEDI cloud contract. The protest came just a day before the bidding process is scheduled to close. As IBM put it in a blog post, it takes issue with the single-vendor approach. It is certainly not alone.

Just about every vendor short of Amazon, which has remained mostly quiet, has been complaining about this strategy. IBM certainly faces a tough fight going up against Amazon and Microsoft.

IBM doesn’t disguise the fact that it thinks the contract has been written for Amazon to win, and it believes the one-vendor approach simply doesn’t make sense. “No business in the world would build a cloud the way JEDI would and then lock in to it for a decade. JEDI turns its back on the preferences of Congress and the administration, is a bad use of taxpayer dollars and was written with just one company in mind,” IBM wrote in the blog post, explaining why it was protesting the deal before a decision was made or the bidding had even closed.

For the record, DOD spokesperson Heather Babb told TechCrunch last month that the bidding is open and no vendor is favored. “The JEDI Cloud final RFP reflects the unique and critical needs of DOD, employing the best practices of competitive pricing and security. No vendors have been pre-selected,” she said.

Much like Oracle, which filed a protest of its own back in August, IBM is a traditional vendor that was late to the cloud. It began a journey to build a cloud business in 2013 when it purchased Infrastructure as a Service vendor SoftLayer and has been using its checkbook to buy software services to add on top of SoftLayer ever since. IBM has concentrated on building cloud services around AI, security, big data, blockchain and other emerging technologies.

Both IBM and Oracle have a problem with the one-vendor approach, especially one that locks in the government for a 10-year period. It’s worth pointing out that the contract is actually an initial two-year deal with two additional three-year options and a final two-year option. The DOD has left open the possibility this might not go the entire 10 years.

It’s also worth putting the contract in perspective. While 10 years and $10 billion is nothing to sneeze at, neither is it as market-altering as it might appear, not when some are predicting the cloud will be a $100 billion a year market very soon.

IBM uses the blog post as a kind of sales pitch as to why it’s a good choice, while at the same time pointing out the flaws in the single vendor approach and complaining that it’s geared toward a single unnamed vendor that we all know is Amazon.

The bidding process closes today, and unless something changes as a result of these protests, the winner will be selected next April.

Oct 11, 2018

New Relic acquires Belgium’s CoScale to expand its monitoring of Kubernetes containers and microservices

New Relic, a provider of analytics and monitoring for a company’s internal and external-facing apps and services to help optimise their performance, is making an acquisition today as it continues to expand a newer area of its business: containers and microservices. The company has announced that it has purchased CoScale, a provider of monitoring for containers and microservices, with a specific focus on Kubernetes.

Terms of the deal — which will include the team and technology — are not being disclosed, as it will not have a material impact on New Relic’s earnings. The larger company is traded on the NYSE (ticker: NEWR), has been on a strong upswing in the last two years, and its current market cap is around $4.6 billion.

Originally founded in Belgium, CoScale had raised $6.4 million and was last valued at $7.9 million, according to PitchBook. Investors included Microsoft (via its ScaleUp accelerator), PMV and the Qbic Fund, two Belgian investors.

“We are thrilled to bring CoScale’s knowledge and deeply technical team into the New Relic fold,” noted Ramon Guiu, senior director of product management at New Relic. “The CoScale team members joining New Relic will focus on incorporating CoScale’s capabilities and experience into continuing innovations for the New Relic platform.”

The deal underscores how New Relic has had to shift in the last couple of years: when the company was founded years ago, application monitoring was a relatively easy task, with the web and a specified number of services the limit of what needed attention. But services, apps and functions have become increasingly complex and now tap data stored across a range of locations and devices, and processing everything generates a lot of computing demand.

New Relic first added container and microservices monitoring to its stack in 2016. That’s a somewhat late arrival to the area, but New Relic CEO Lew Cirne believes it came at just the right time, dovetailing New Relic’s changes with wider shifts in the market.

“We think those changes have actually been an opportunity for us to further differentiate and further strengthen our thesis that the New Relic way is really the most logical way to address this,” he told my colleague Ron Miller last month. As Ron wrote, Cirne’s take is that New Relic has always been centered on the code, as opposed to the infrastructure where it’s delivered, and that has helped it make adjustments as the delivery mechanisms have changed.

New Relic already provides monitoring for Kubernetes, Google Kubernetes Engine (GKE), Amazon Elastic Container Service for Kubernetes (EKS), Microsoft Azure Kubernetes Service (AKS), and Red Hat OpenShift, and the idea is that CoScale will help it ramp up across that range, while also adding Docker and OpenShift to the mix, as well as offering new services down the line to serve the DevOps community.

“The visions of New Relic and CoScale are remarkably well aligned, so our team is excited that we get to join New Relic and continue on our journey of helping companies innovate faster by providing them visibility into the performance of their modern architectures,” said CoScale CEO Stijn Polfliet, in a statement. “[Co-founder] Fred [Ryckbosch] and I feel like this is such an exciting space and time to be in this market, and we’re thrilled to be teaming up with the amazing team at New Relic, the leader in monitoring modern applications and infrastructure.”

Oct 11, 2018

Zuora partners with Amazon Pay to expand subscription billing options

Zuora, the SaaS company helping organizations manage payments for subscription businesses, announced today that it had been selected as a Premier Partner in the Amazon Pay Global Partner Program. 

The “Premier Partner” distinction means businesses using Zuora’s billing platform can now easily integrate Amazon’s digital payment system as an option during checkout or recurring payment processes. 

The strategic rationale for Zuora is clear, as the partnership expands the company’s product offering to prospective and existing customers. The ability to support a wide array of payment methods is a key value proposition for subscription businesses, enabling them to serve a larger customer base and provide a more seamless customer experience.

It also doesn’t hurt to have a deep-pocketed ally like Amazon in a fairly early-stage industry.  With omnipotent tech titans waging war over digital payment dominance, Amazon has reportedly doubled down on efforts to spread Amazon Pay usage, cutting into its own margins and offering incentives to retailers.

As adoption of Amazon Pay spreads, subscription businesses will be compelled to offer the service as an available payment option and Zuora should benefit from supporting early billing integration.

For Amazon Pay, teaming up with Zuora provides direct access to Zuora’s customer base, which caters to tens of millions of subscribers. 

With Zuora minimizing the complexity of adding additional payment options, which can often disrupt an otherwise unobtrusive subscription purchase experience, the partnership with Zuora should help spur Amazon Pay adoption and reduce potential friction.

“By extending the trust and convenience of the Amazon experience to Zuora, merchants around the world can now streamline the subscription checkout experience for their customers,” said Vice President of Amazon Pay, Patrick Gauthier.  “We are excited to be working with Zuora to accelerate the Amazon Pay integration process for their merchants and provide a fast, simple and secure payment solution that helps grow their business.”

The world subscribed

The collaboration with Amazon Pay represents another milestone for Zuora, which completed its IPO in April of this year and is now looking to further differentiate its offering from competing in-house systems or large incumbents in the Enterprise Resource Planning (ERP) space, such as Oracle or SAP.   

Going forward, Zuora hopes to play a central role in ushering in a broader shift toward a subscription-based economy.

Tien Tzuo, founder and CEO of Zuora, told TechCrunch he wants the company to help businesses first realize they should be in the subscription economy and then provide them with the resources necessary to flourish within it.

“Our vision is the world subscribed,” said Tzuo. “We want to be the leading company that has the right technology platform to get companies to be successful in the subscription economy.”

The partnership will launch with publishers “The Seattle Times” and “The Telegraph”, with both now offering Amazon Pay as a payment method while running on the Zuora platform.

Oct 11, 2018

Snowflake scoops up another blizzard of cash with $450 million round

When Snowflake, the cloud data warehouse, landed a $263 million investment earlier this year, CEO Bob Muglia speculated that it would be the last money his company would need before an eventual IPO. But just nine months after that statement, the company has announced a second, even larger round. This time it’s getting $450 million, as an unexpected level of growth led it to seek additional cash.

Sequoia Capital led the round, joined by new investor Meritech Capital and existing investors Altimeter Capital, Capital One Growth Ventures, Madrona Venture Group, Redpoint Ventures, Sutter Hill Ventures, Iconiq Capital and Wing Ventures. Today’s round brings the total raised to over $928 million, with $713 million coming just this year. That’s a lot of dough.

Oh, and the valuation has skyrocketed too, from $1.5 billion in January to $3.5 billion with today’s investment. “We are increasing the valuation from the prior round substantially, and it’s driven by the growth numbers of almost quadrupling the revenue, and tripling the customer base,” company CFO Thomas Tuchscherer told TechCrunch.

At the time of the $263 million round, Muglia was convinced the company had enough funds and that the next fundraise would be an IPO. “We have put ourselves on the path to IPO. That’s our mid- to long-term plan. This funding allows us to go directly to IPO and gives us sufficient capital, that if we choose, IPO would be our next funding step,” he said in January.

Tuchscherer said that was, in fact, the plan at the time of the first batch of funding. He joined the company partly because of his experience taking Talend public in 2016, but he said the growth has been so phenomenal that they felt it was necessary to change course.

“When we raised $263 million earlier in the year, we raised based on a plan that was ambitious in terms of growth and investment. We are exceeding and beating that, and it prompted us to explore how do we accelerate investment to continue driving the company’s growth,” he said.

Running on both Amazon Web Services and Microsoft Azure, which the company added as a supported platform earlier this year, certainly contributed to the increased sales, and forced them to rethink the amount of money it would take to fuel their growth spurt.

“I think it’s very important as a distinction that we view the funding as being customer driven in the sense that in order to meet the demand that we’re seeing in the market for Snowflake, we have to invest in our infrastructure, as well as in our R&D capacity. So the funding that we’re raising now is meant to finance those two core investments,” he stressed.

The number of employees is skyrocketing as the company adds customers. Just eight months ago the company had around 350 employees. Today it has close to 650. Tuchscherer expects that to grow to between 900 and 1,000 by the end of January, which is not that far off.

As for that IPO, surely that is still a goal, but the growth simply got in the way. “We are building the company to be autonomous and to be a large independent company. It’s definitely on the horizon,” he said.

While Tuchscherer wouldn’t definitively say that the company is looking to support at least one more cloud platform in addition to Amazon and Microsoft, he strongly hinted that such a prospect could happen.

The company also plans to plunge a lot of money into the sales team, building out new sales offices in the US and doubling their presence around the world, while also enhancing the engineering and R&D teams to expand their product offerings.

Just this year alone the company has added Netflix, Office Depot, DoorDash, Netgear, Ebates and Yamaha as customers. Other customers include Capital One, Lionsgate and HubSpot.

Oct 10, 2018

Google+ for G Suite lives on and gets new features

You thought Google+ was dead, didn’t you? And it is — if you’re a consumer. But the business version of Google’s social network will live on for the foreseeable future — and it’s getting a bunch of new features today.

Google+ for G Suite isn’t all that different from the consumer version of Google+, but its focus is very much on allowing users inside a company to easily share information. Current users include the likes of Nielsen and French retailer Auchan.

The new features that Google is announcing today give admins more tools for managing and reviewing posts, allow employees to tag content and provide better engagement metrics to posters.

Recently, Google introduced the ability for admins to bulk-add groups of users to a Google+ community, for example. Soon, those admins will also be able to better review and moderate posts made by their employees, and to define custom streams so that employees can get access to a stream with all of the posts from a company’s leadership team, for example.

But what’s maybe more important in this context is that tags now make it easy for employees to route content to everybody in the company, no matter which group they work in. “Even if you don’t know all employees across an organization, tags makes it easier to route content to the right folks,” the company explains in today’s blog post. “Soon you’ll be able to draft posts and see suggested tags, like #research or #customer-insights when posting customer survey results.”

As far as the new metrics go, there’s nothing all that exciting going on here, but G Suite customers who keep their reporting structure in the service will be able to provide analytics to employees so they can see how their posts are being viewed across the company and which teams engage most with them.

At the end of the day, none of these are revolutionary features. But the timing of today’s announcement surely isn’t a coincidence, given that Google announced the death of the consumer version of Google+ — and the bug that went along with that — only a few days ago. Today’s announcement is clearly meant to be a reminder that Google+ for the enterprise isn’t going away and remains in active development. I don’t think all that many businesses currently use Google+, though, and with Hangouts Chat and other tools, they now have plenty of options for sharing content across groups.

Oct 10, 2018

Google’s Apigee officially launches its API monitoring service

It’s been about two years since Google acquired API management service Apigee. Today, the company is announcing new extensions that make it easier to integrate the service with a number of Google Cloud services, as well as the general availability of the company’s API monitoring solution.

Apigee API monitoring allows operations teams to get more insight into how their APIs are performing. The idea here is to make it easy for these teams to figure out when there’s an issue and what the root cause is, by giving them very granular data. “APIs are now part of how a lot of companies are doing business,” Ed Anuff, Apigee’s former SVP of product strategy and now Google’s product and strategy lead for the service, told me. “So that tees up the need for API monitoring.”

Anuff also told me that he believes that it’s still early days for enterprise API adoption — but that also means that Apigee is currently growing fast as enterprise developers now start adopting modern development techniques. “I think we’re actually still pretty early in enterprise adoption of APIs,” he said. “So what we’re seeing is a lot more customers going into full production usage of their APIs. A lot of what we had seen before was people using it for maybe an experiment or something that they were doing with a couple of partners.” He also attributed part of the recent growth to customers launching more mobile applications where APIs obviously form the backbone of much of the logic that drives those apps.

API Monitoring was already available as a beta, but it’s now generally available to all Apigee customers.

Given that it’s now owned by Google, it’s no surprise that Apigee is also launching deeper integrations with Google’s cloud services now — specifically services like BigQuery, Cloud Firestore, Pub/Sub, Cloud Storage and Spanner. Some Apigee customers are already using this to store every message passed through their APIs to create extensive logs, often for compliance reasons. Others use Cloud Firestore to personalize content delivery for their web users or to collect data from their APIs and then send that to BigQuery for analysis.

Anuff stressed that Apigee remains just as open to third-party integrations as it always was. That is part of the core promise of APIs, after all.

Oct 10, 2018

Google introduces dual-region storage buckets to simplify data redundancy

Google is playing catch-up in the cloud, and as such it wants to provide flexibility to differentiate itself from AWS and Microsoft. Today, the company announced a couple of new options to help separate it from the cloud storage pack.

Storage may seem stodgy, but it’s a primary building block for many cloud applications. Before you can build an application you need the data that will drive it, and that’s where the storage component comes into play.

One of the issues companies have as they move data to the cloud is making sure it stays close to the application when it’s needed to reduce latency. Customers also require redundancy in the event of a catastrophic failure, but still need access with low latency. The latter has been a hard problem to solve until today when Google introduced a new dual-regional storage option.

As Google described it in the blog post announcing the new feature, “With this new option, you write to a single dual-regional bucket without having to manually copy data between primary and secondary locations. No replication tool is needed to do this and there are no network charges associated with replicating the data, which means less overhead for you storage administrators out there. In the event of a region failure, we transparently handle the failover and ensure continuity for your users and applications accessing data in Cloud Storage.”

This allows companies to have redundancy with low latency, while controlling where the data lives, without having to manually move it should the need arise.

Knowing what you’re paying

Companies don’t always require instant access to data, and Google (and other cloud vendors) offer a variety of storage options, making it cheaper to store and retrieve archived data. As of today, Google is offering a clear way to determine costs based on the storage types customers choose. While it might not seem revolutionary to let customers know what they are paying, Dominic Preuss, Google’s director of product management, says it hasn’t always been a simple matter to calculate these kinds of costs in the cloud. Google decided to simplify this by clearly outlining the costs for medium-term (Nearline) and long-term (Coldline) storage across multiple regions.

As Google describes it, “With multi-regional Nearline and Coldline storage, you can access your data with millisecond latency, it’s distributed redundantly across a multi-region (U.S., EU or Asia), and you pay archival prices. This is helpful when you have data that won’t be accessed very often, but still needs to be protected with geographically dispersed copies, like media archives or regulated content. It also simplifies management.”

Under the new plan, you can select the type of storage you need, the kind of regional coverage you want and you can see exactly what you are paying.
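The trade-off behind that choice is easy to see with a toy comparison. The per-GB prices below are illustrative placeholders, not Google's actual rates; the point is the shape of the math, not the numbers:

```python
# Illustrative (made-up) per-GB monthly prices for three storage
# classes, plus a per-GB retrieval fee for the archival tiers.
CLASSES = {
    # class: (storage $/GB/month, retrieval $/GB)
    "standard": (0.026, 0.000),
    "nearline": (0.010, 0.010),
    "coldline": (0.007, 0.050),
}

def monthly_cost(storage_class, stored_gb, retrieved_gb):
    """One month's bill: storage cost plus retrieval cost."""
    per_gb, retrieval = CLASSES[storage_class]
    return stored_gb * per_gb + retrieved_gb * retrieval

# 10 TB archived, 1% of it read back during the month. The colder
# the class, the cheaper storage is -- until retrieval dominates.
for cls in CLASSES:
    print(cls, round(monthly_cost(cls, 10_000, 100), 2))
```

With rarely accessed data, the archival tiers win easily; read the data back often enough and the retrieval fees flip the ordering, which is exactly the decision this pricing transparency is meant to make visible.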

Google Cloud storage pricing options. Chart: Google

Each of these new storage services has been designed to provide additional options for Google Cloud customers, giving them more transparency around pricing and flexibility and control over storage types, regions and the way they deal with redundancy across data stores.
