
Cloud Market Update Germany 2015: Cloudy with a Chance of Digital Enterprises


The hard numbers speak for themselves. Anyone who still assumes that the cloud is just a playground for developers should take a lesson from the $1.8bn revenue ($391m net profit) of Amazon Web Services. Such numbers cannot be achieved with a couple of small workloads; they are a clear indication of the cloud's growing relevance for corporate customers. The strengths AWS has proven on the international stage are increasingly visible in Germany as well and are having a positive impact on the German cloud market.

"First they ignore you, then they laugh at you, then they fight you, then you win." The evergreen quote, commonly attributed to Mahatma Gandhi, also fits the cloud story. In the beginning, the cloud was laughed off as mere hype; then came the fight, waged partly with erroneous arguments, chief among them the ever-recurring topics of data protection and data security, heavily covered by the media and wrongly used interchangeably. By the end of 2014, however, the cloud had spread rapidly in Germany, and this shift continues in 2015.

The cloud's growth can be compared to the triumphant spread of WLAN in the enterprise, which helps explain why the cloud is finally prevailing. CIOs were once ridiculed for daring to introduce WLAN infrastructure and open their businesses to the outside world. A shift in mindset and technological progress have since put to rest the claim that WLAN is insecure.

The Cloud is the foundation for Digital Transformation

For close to 75 percent of German enterprises, cloud computing occupies a firm place on the IT agenda. The cloud is either an active component of ongoing IT operations, deployed as part of projects and workloads, or in the planning and implementation stage. There is a clear trend toward hybrid cloud solutions: more than half of the survey respondents (57 percent) rely on this operating model. In the context of multi-cloud strategies, managed private cloud environments (57 percent) are also gaining importance.

The hybrid cloud stays in focus, especially because enterprises are intensely occupied with the topic of "data gravity", which essentially comes down to data immobility: either the data volume is too large to store in the cloud, or legal frameworks require keeping specific data on premises. A hybrid cloud architecture is well positioned to address the challenges of data gravity and offer workable solutions. In this case, data mobility is no longer necessary. Instead, the data are kept on private storage systems (for example, in a managed private cloud), and public cloud services (compute power, applications, etc.) access the data for processing. In the course of that processing the data may be modified and new data may be generated, but the data never leave the storage system in the direction of the public cloud.
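As a minimal sketch of this pattern, assume a hypothetical on-premise data API (data.internal.example) that is reachable from compute instances in the public cloud: the worker streams records for processing and writes only the derived result back, so no raw data is ever persisted on public cloud storage.

```python
# Hedged sketch: a compute worker running in the public cloud processes records
# that remain on a private (on-premise / managed private cloud) storage service.
# Only the derived result is written back; the raw data never lands on public
# cloud storage. The endpoint and field names (data.internal.example, /orders,
# "amount") are hypothetical placeholders.
import requests

PRIVATE_DATA_API = "https://data.internal.example/api/v1"


def process_in_place(batch_size: int = 1000) -> float:
    """Stream records from the private store and aggregate them in memory."""
    total, offset = 0.0, 0
    while True:
        resp = requests.get(
            f"{PRIVATE_DATA_API}/orders",
            params={"limit": batch_size, "offset": offset},
            timeout=30,
        )
        resp.raise_for_status()
        records = resp.json()
        if not records:
            break
        total += sum(r["amount"] for r in records)
        offset += batch_size
    return total


if __name__ == "__main__":
    result = process_in_place()
    # Write only the derived result back to the private store.
    requests.post(
        f"{PRIVATE_DATA_API}/reports",
        json={"total_revenue": result},
        timeout=30,
    )
```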

Only around 25 percent of German decision makers find no place for the cloud on their IT agenda, and it remains to be seen how long this stays the case. The pressure from the business departments for more flexibility and a shorter time-to-market for new applications keeps growing. In the end, it boils down to CIOs not being able to move forward without using cloud services in one form or another. And who wants to be accused of negligently striking the wrong note when it comes to the digital transformation opportunity?

Among the businesses that actively dedicate themselves to digital transformation and accordingly build a Digital Infrastructure Fabric (DIF), cloud platforms and infrastructure are a central part of the IT strategy supporting these digital efforts. This is reflected in the clear presence of Infrastructure-as-a-Service and Platform-as-a-Service on the agendas of 63 percent and 70 percent of CIOs respectively, where they serve as the foundation for the development and operation of new applications.

Furthermore, the Internet of Things (IoT) will spur the German cloud market forward and quickly become a decisive factor in the future competitiveness of businesses, which must start grappling with the necessary technologies now. Public cloud environments, especially infrastructure (IaaS) and platforms (PaaS), offer ideal prerequisites for the backend support of IoT services and end devices. The leading public cloud providers already have the critical attributes for this purpose built in, ready to be further developed into IoT backends. One can therefore rightly take the view that cloud growth in Germany and the advancement of the Internet of Things are closely interrelated.

Provider Overview

An overview of the activities of the most important public cloud providers is presented in the current research note "Public Cloud Providers in Germany: Frankfurt is the Stronghold of Cloud Data Centers".

Halfway through 2015: Six Cloud Trends in Germany are already confirmed

At the beginning of the year, Crisp Research outlined ten trends in the cloud computing market in Germany. Six of them are already a reality.

  • The Public Cloud arrived in Germany
    The public cloud model is gaining massive ground in Germany. At the AWS Summit in Berlin, substantial projects from companies like Audi, Zalando and Zanox spoke a clear language. The public cloud enjoys towering popularity because IT users rely on it to bring agility and flexibility into application management. Moreover, by now almost all relevant public cloud providers operate data centers on German soil.
  • Consultants and System Integrators benefit from the cloud boom
    The complexity associated with public cloud infrastructure, together with the lack of cloud expertise and development skills in German enterprises, brings cloud system integrators into the game. This has been confirmed in conversations with decision makers from companies like Direkt Gruppe, TecRacer or Beck et al. The roughly EUR 2.9bn market is shared by those system integrators and consultants who are truly committed to the cloud and make a significant contribution to the digital transformation in Germany.
  • Multi Cloud is now a reality
    The use of cloud deployments spanning several providers has grown strongly. As part of their cloud strategies, companies evaluate and operate at least two cloud providers. One reason for this is risk management. Another, more important reason is that not every provider is capable of handling all applications, e.g. bare metal or legacy IT. At present, customers administer workloads individually on each cloud provider's platform; in the mid-term, management will shift to a more centralized platform that simplifies deployment.
  • Frankfurt ensures cloud connectivity and performance
    Cloud connectivity is of paramount importance for both users and providers, as technical challenges (low latency, high availability and throughput) must be tackled to ensure high-performing, stable and secure operation of applications and services. Frankfurt is the core of the German and European cloud markets. A look at the currently leading public cloud providers in the German market reveals that half of them have already selected Frankfurt as a data center location, affirming the relevance of cloud connectivity and performance. Amazon Web Services, ProfitBricks, SoftLayer and VMware are already present in Frankfurt, to be followed by Salesforce in August.
  • The Internet of Things drives Mobile Backend Development
    In the course of digitalization, the development of mobile apps in the context of the Internet of Things is increasing. The data gathered from end devices are transferred to backend infrastructure, where backend applications analyze them, link them with other data and prepare them for visualization in a front end. Mobile backends are becoming IoT backends and an important part of the IoT value chain. What the market currently knows as fitness wearables, which many people use to measure certain personal indicators, will spread to other industries. A lot of movement can be observed in the smart home space: providers like Tado (intelligent heating and air conditioning management) or Netatmo (weather station) use backend applications based on cloud infrastructure that handle the integrated connection of end devices and their secure control.
  • More Services – Pricing rises
    Price reductions belong to the past; the price barometer now points in the other direction. Microsoft, for one, will increase prices for Azure in the European region from August 1st. The focus is increasingly on innovation: Amazon AWS has released 220 new services and functions this year, Microsoft Azure about 110. ProfitBricks has also recognized that pure infrastructure has become a commodity and offers very little potential for innovation. Instead, customers need to be enabled to build their own products and services on top of cloud infrastructure. ProfitBricks presented DevOps Central, a portal meant to inspire developers to use its infrastructure environment, and introduced proprietary SDKs for Java and Go that help developers manage ProfitBricks infrastructure components via its REST API.

Marketing Gimmick: "Cloud Made in Germany"

What has "the German cloud" actually achieved? Nothing! The marketing around "the German cloud" really needs to stop. German customers never asked for a German cloud; the name is the creative invention of a couple of German marketing managers.

Instead of applauding "the German cloud", German providers should use their strengths to develop and bring to market innovative and appealing cloud services. The market leadership and clear innovation advantage of the US providers over the German ones is a serious problem. In the end, the appeal and competitiveness of German cloud services keep shrinking, and many enterprises are reaching for the offerings of US cloud providers.

The fact is that a large share of German CIOs ask for German data centers in order to guarantee the required level of data protection and fulfil legal requirements. At the same time, enterprises should not neglect the significance of global scalability for expanding into other geographies; having a provider with a global footprint is imperative. This is one more reason why IT and cloud strategies must be aligned from the beginning.

Bottom Line: Digitalization drives Cloud Computing

German businesses find themselves in the middle of their cloud transformation. Step by step, they lay out a multi-cloud strategy based on existing infrastructure, platforms and services from different providers. Along the way, CIOs assess technologies and providers for the creation of their Digital Infrastructure Fabric (DIF), which will carry the technological implementation of their individual digital strategies and lay the foundation for new business models and agile business processes.

The cloud has acquired leading status as a vehicle for the digital transformation. Only by means of deploying dynamic and globally scalable platforms and infrastructure can the IT strategy adequately address the evolving market conditions and support the business strategy in an agile way from a technical perspective.


Shared-Responsibility in the Public Cloud (Webcast Recording in German)


Public cloud and responsibility is a difficult topic, and most companies still have to understand that self-responsibility is central to using the public cloud. The provider does its homework up to a certain level and delivers the services and tools the customer has to use to do its own. The complexity therefore lies mostly in the architecture of the infrastructure and in the applications running on it. A customer cannot simply drop full responsibility into the hands of the provider.

"Shared responsibility" is key in the public cloud. How this works, and how companies have to deal with it in terms of operations and security, is something I discussed with Amazon Web Services Germany country manager Martin Geier and Zalando STUPS hacker Henning Jacobs during the Computerwoche live webcast "Security First in the Public Cloud".

The recording of the webcast is now online and can be watched for free (in German) at "Webcast mit Zalando und AWS: Security First in der Public Cloud".


Picture source & credits: Amazon Web Services Germany

The Big Misunderstanding: Shared Responsibility in the Public Cloud


Responsibility in the public cloud is a story of several misunderstandings. Advisory sessions and conversations with companies interested in the public cloud reveal that the classical outsourcing mindset is still widespread among IT decision makers. Public cloud providers are seen as full service providers. That complicates negotiations on equal footing and blocks the quick adoption of public cloud services. "Shared responsibility" is the key concept that needs to be internalized. This research note clears up the wrong assumptions and describes the concept.

Self Responsibility: The Big Misunderstanding

Over the past 10 years, cloud computing has often been defined, for convenience, as "outsourcing 2.0". What was meant to improve understanding on the user side, however, did public cloud providers a disservice. With this understanding in mind (an external service provider takes over responsibility for part or all of IT operations), IT decision makers developed the expectation that public cloud providers are full service providers and that the IT department merely coordinates and controls the external service provider.

What is true for a software-as-a-service (SaaS) provider as a vendor of low-hanging fruit is completely different at the platform-as-a-service (PaaS) and in particular at the infrastructure-as-a-service (IaaS) level. SaaS providers deliver fully developed, ready-to-use applications. The complexity, for example with solutions from Salesforce and SAP, lies in configuration, customization and, if necessary, integration with other SaaS providers. The SaaS provider is thus responsible for the deployment and the entire operation of the software and the necessary infrastructure and platform; the customer simply consumes the application. PaaS providers deploy environments for the development and operation of applications. Via APIs, the customer gets access to the platform and can develop and operate its own applications and offer them to its own customers. The provider is responsible for deploying and operating the infrastructure and the platform; the customer is 100 percent responsible for its application but has no influence on the platform or the infrastructure. IaaS providers only take responsibility at the infrastructure level. Everything that happens at higher levels is in the customer's area of responsibility.

Thus, it is wrong to see public cloud providers such as Amazon Web Services, Microsoft Azure or VMware (vCloud Air) as full service providers who take responsibility for the entire stack, from infrastructure up to the application level. Self-responsibility is required instead!

Shared Responsibility: This is how IaaS Management works in the Public Cloud

A decisive detail that clearly distinguishes this deployment model from outsourcing is self-service. Depending on their DNA, providers only take responsibility for specific areas; the customer is responsible for the rest.

The public cloud is therefore about sharing responsibilities, referred to as shared responsibility. The provider and its customer divide the field of duties between them, and the customer's self-responsibility plays a major role. In the context of IaaS usage, the provider is responsible for the operation and security of the physical environment. It takes care of:

  • Setup and maintenance of the entire data center infrastructure.
  • Deployment of compute power, storage, networking, managed services (like databases) and other microservices.
  • Provisioning of the virtualization layer that customers use to request virtual resources at any time.
  • Deployment of services and tools that customers can use to manage their areas of responsibility.

The customer is responsible for the operations and security of the logical environment. This includes:

  • Setup of the virtual infrastructure.
  • Installation of operating systems.
  • Configuration of networks and firewall settings.
  • Operations of own applications and self-developed (micro)services.

A very important part is security. The customer is 100 percent responsible for securing its own environment. This includes:

  • Security of operating systems, applications and own services.
  • Encryption of data and data connections, as well as ensuring the integrity of systems through authentication mechanisms and identity and access controls at the system and application level.

Thus, the customer is responsible for the operation and security of its own infrastructure environment and of the systems, applications, services and stored data on top of it. However, providers like Amazon Web Services, Microsoft Azure or VMware vCloud Air offer comprehensive tools and services that customers can use, for example, to encrypt their data and to enforce identity and access controls. In addition, enablement services (microservices) exist that customers can adopt to develop their own applications more quickly and easily.
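To make the customer's side of this split tangible, here is a minimal sketch, assuming an AWS environment managed with the boto3 SDK; the resource names (my-app-sg, my-app-data) and the CIDR range are hypothetical placeholders. It covers two of the duties listed above: tightening an inbound firewall rule and switching on default server-side encryption for a storage bucket.

```python
# Hedged sketch of customer-side duties under shared responsibility on AWS,
# using the boto3 SDK. Resource names (my-app-sg, my-app-data) and the CIDR
# range are hypothetical placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-central-1")
s3 = boto3.client("s3", region_name="eu-central-1")

# Network/firewall configuration: allow inbound HTTPS only from the corporate range.
sg = ec2.create_security_group(
    GroupName="my-app-sg",
    Description="Restrict inbound traffic to HTTPS from the corporate range",
)
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "203.0.113.0/24"}],  # hypothetical corporate range
    }],
)

# Data encryption: enforce server-side encryption by default on the bucket.
s3.put_bucket_encryption(
    Bucket="my-app-data",
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}]
    },
)
```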

Within its area of responsibility, the customer is thus on its own and has to act accordingly. However, constantly growing partner networks help customers set up virtual infrastructures securely and run applications and workloads on top of public clouds.

@CIO: Public Cloud means breaking with Antiquated Traditions

In addition to requiring an understanding of the shared responsibility concept, using public cloud infrastructure also makes it imperative to rethink the infrastructure design as well as the architecture of the corresponding applications and services.

On the way to public cloud infrastructure, self-service initially looks simple. However, the devil is in the detail, hidden in complexity that is not obvious at first. That is why CIOs should focus on the following topics from the start:

  1. Understand the respective provider's portfolio and the characteristics of its platform/infrastructure. This sounds easy, but public cloud infrastructure environments develop at enormous speed. It is therefore necessary to know the range of functions and the availability of all services on the infrastructure platform and to train employees on a rolling basis in order to exploit the full potential.
  2. Focus on a greenfield approach with a microservice architecture. Public cloud infrastructures follow completely different architecture and design concepts from those taught and implemented just a few years ago. Instead of developing monolithic applications, cloud infrastructure favors so-called microservice architectures: independent, loosely coupled and individually scalable services that are integrated to form the overall application. This ensures better scalability and leads to higher availability of the application as a whole.
  3. Consider "design for failure". "Everything fails, all the time" (Werner Vogels, CTO, Amazon.com). The design of a cloud application has to follow the rules and characteristics of cloud computing and aim for high availability. One has to avoid any single point of failure and assume that something can go wrong at any time. The goal is an application that keeps working even if the provider's underlying physical infrastructure runs into trouble; the providers offer the necessary tools and services for this purpose (see the retry sketch after this list).
  4. Use existing best practices and operational excellence guidelines for the virtual environment. Leading cloud users like Netflix impressively show how to handle self-responsibility, or rather shared responsibility, in the public cloud. To this end, Netflix has developed its "Simian Army", a large set of tools and services it uses to ensure highly available operations of its virtual infrastructure on top of the Amazon Web Services cloud. Zalando is taking similar steps with its own STUPS.io framework.
  5. Consider managed public cloud providers. The complexity of the public cloud shouldn't be underestimated. This applies to setting up the necessary virtual infrastructure, to application development, and to operations and the holistic implementation of all security mechanisms. More and more system integrators like Direkt Gruppe, TecRacer or Beck et al. Services specialize in the operations of public clouds. In addition, former web hosting providers and MSPs like Rackspace (whose Fanatical Support is now available for Microsoft Azure) are transforming into managed public cloud providers. And many more will follow!
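As a small, hedged illustration of the "design for failure" principle from item 3, the sketch below retries a transiently failing call with exponential backoff and jitter instead of assuming a dependency is always available; call_backend_service is a hypothetical placeholder for any remote dependency (a cloud API, a microservice, a database), not part of any provider SDK.

```python
# Minimal "design for failure" sketch: retry a transiently failing call with
# exponential backoff and jitter instead of assuming the dependency is always up.
# call_backend_service() is a hypothetical placeholder for any remote dependency.
import random
import time


def call_with_retries(call, max_attempts=5, base_delay=0.5):
    """Invoke `call`, retrying on failure with exponential backoff plus jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except Exception:
            if attempt == max_attempts:
                raise  # give up after the last attempt and surface the error
            # Sleep 0.5s, 1s, 2s, ... plus random jitter to avoid thundering herds.
            time.sleep(base_delay * 2 ** (attempt - 1) + random.uniform(0, 0.1))


def call_backend_service():
    # Hypothetical stand-in for a call that can fail transiently.
    raise ConnectionError("backend temporarily unavailable")


if __name__ == "__main__":
    try:
        call_with_retries(call_backend_service)
    except ConnectionError as err:
        print(f"all retries exhausted: {err}")
```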

The growing number of cloud migration projects at large mid-sized companies and enterprises indicates that public cloud infrastructure platforms are becoming the new norm, while old architecture, design and security concepts are being replaced. After being ignored for several years, this deployment model is now also making its way onto the digital infrastructure agendas of IT decision makers. However, only CIOs who change their mindset and treat the shared responsibility concept as a given will successfully make use of the public cloud.

Analyst Strategy Paper: Generation Cloud – The Market for MSPs and System Integrators in Transition


The market for system integrators and managed services providers is undergoing substantial change. Only those who start their cloud transformation as early as possible will be able to survive on the market in the long run. The reason for this development is the changing purchasing behavior of IT decision-makers, who are looking for more flexibility in the use of IT resources. System integrators and managed services providers are thus faced with a fundamental change to their core business and need to bring their employees' skill sets to cloud-ready status as quickly as possible. In this context, public cloud infrastructure offers ideal conditions, in terms of price-performance ratio, for running customer systems and applications in a managed services model. This allows a faster response to changing customer requirements and varying market situations. System integrators and managed services providers can benefit from the high availability, scalability and high security standards of public cloud infrastructure. As a consequence, they can free themselves from their capital-intensive business (shifting from a CAPEX to an OPEX model) and design their pricing and marketing models more flexibly.

In the strategy paper “Generation Cloud – The Market for MSPs and System Integrators in Transition”, Crisp Research analyses the new role of MSPs and System Integrators in the age of the cloud.

The strategy paper can be downloaded free of charge at "Generation Cloud – The Market for MSPs and System Integrators in Transition".

Contribution to the book “SAP on the Cloud”


At the end of 2014, Dr. Michael Missbach got in touch with me to tell me about his plans to update his book "SAP on the Cloud". The book gives a general overview of SAP's strategy and products around cloud computing and also covers (internal) cloud projects and ideas at SAP. One of his interests was the SAP Monsoon project, which was established and driven forward by Jens Fuchs. I cover the project as part of my OpenStack and open source cloud research at Crisp.

… and that is how I came to contribute the "SAP Monsoon" chapter to the book "SAP on the Cloud".

The book can be ordered at http://www.springer.com/br/book/9783642436048.

Analyst Strategy Paper: Public Cloud – The Key to a Successful Digital Transformation


Within the framework of the digital agenda, IT infrastructure is of central importance. More than two thirds (68 percent) of companies regard digital infrastructure as the most important building block and the key to the successful digitization of their business models and processes. The Public Cloud is one of the most important vehicles of this digital evolution. Only with dynamic and globally scalable infrastructure are companies able to adapt their IT strategies to continuously changing market situations and thus provide strong technical support for their corporate strategy. With a Digital Infrastructure Fabric, companies map the technological image of their "Digital Enterprise", defining all necessary players and drivers of their digital evolution.

Public cloud infrastructure services represent a solid base for supporting the digitization strategies of companies of any size, but above all of companies with highly scalable IT workloads. Startups, for example, can grow gradually without having to invest massively in IT resources from the very beginning. In this way, companies gain one of the most important prerequisites for having a say in the digital evolution: speed. For IT departments, there is more at stake today than just preserving the status quo. IT must position itself as a strategic partner and business enabler, capable of satisfying the individual needs of the business departments, and pursue the goal of creating a competitive edge for the company on the basis of digital technologies. Public cloud infrastructure supports these proactive measures of the IT departments.

The strategy paper can be downloaded free of charge at "Public Cloud – The Key to a Successful Digital Transformation".

Analyst Strategy Paper: How to resist Data Gravity


As a result of the growing digitization of business models and processes, CIOs and IT decision-makers are compelled to seek new sourcing models and infrastructure concepts, and the public cloud plays a major role in this context. In the digital age, the significance of data management takes on a new dimension. Data have a certain inertia and hence need to be assigned to different classes so that they comply with regulations, legal requirements, technical limitations and individual security classes. This so-called "data gravity" impacts the mobility of data. New storage concepts are needed to process these hard-to-move data outside of one's own IT infrastructure without loss of control. Hybrid and multi-cloud storage architectures provide implementation strategies and robust usage scenarios that are aligned with these new requirements. Within these architecture concepts, the data are located in a company-controlled area and the data owner alone determines which parts are to be stored in the public cloud. Accordingly, all benefits of public cloud infrastructure can be utilized without losing control of one's data, while fulfilling the necessary compliance guidelines and legal requirements.

In the strategy paper “How to resist Data Gravity”, Crisp Research analyses and explains the connection between “data gravity” and the public cloud and illustrates new ways to reduce the impact.

The strategy paper can be downloaded free of charge at "How to resist Data Gravity".

Analyst Strategy Paper: Cloud Data Fabric – Enterprise Storage Services in the Cloud


The maturing public cloud gains increasing importance as an attractive alternative to on-premise enterprise IT infrastructure. Public cloud infrastructure offers CIOs an abundance of possibilities for operating existing infrastructure and application environments more flexibly and at lower cost. However, the existing public cloud storage services have been developed with a focus on the new generation of applications, which is why they are less well prepared to run existing enterprise applications. At present, the requisite storage concepts, standards and technologies for using legacy enterprise applications on public cloud infrastructure are still not available. To ensure the continued existence and proper operation of conventional application architectures on public cloud infrastructure, it is necessary to transfer the established and widespread standards into the public cloud. As legacy applications, rather than new cloud-native applications, still constitute the lion's share of potential cloud migration candidates, well-known storage concepts are required so that existing applications can be migrated to public cloud infrastructure without modification and continue to be operated there.

In the strategy paper “Cloud Data Fabric”, Crisp Research analyses and explains the different enterprise storage options in the cloud and illustrates how to establish storage management with public cloud infrastructure.

The strategy paper can be downloaded free of charge at "Cloud Data Fabric – Enterprise Storage Services in the Cloud".


Analyst Strategy Paper: Service Management in the Public Cloud


In the coming years, the Public Cloud will inevitably continue to take hold. From a technical point of view, the use of dynamic infrastructure is the only means to respond to ever-changing market situations and to address them in a proactive fashion.

The public cloud, as a black box, makes it harder for IT organizations to keep sight of the big picture and to live up to their supervisory obligations. This becomes evident mainly in the lack of close ties to the actual IT operations of the cloud infrastructure.

The use of Public Cloud infrastructure is based on the shared responsibility model, in which the responsibilities are clearly separated between the provider (physical environment) and its clients (logical environment).

In addition to bearing full responsibility for the logical environment, the customer needs to find an answer not only to the question of how to handle the black box (the physical environment), but also of how to measure the cloud provider's services at this level in order to maintain control.

With ITIL, CIOs have a powerful framework at their disposal which enables them to monitor the public cloud provider at all levels. Through established ITIL procedures, they can provide the business side with the facts that are required for reporting.

The strategy paper can be downloaded free of charge at "Service Management in the Public Cloud – Sourcing within the Digital Transformation".

Analyst Study Report: Evaluation of IoT Backend Providers


The question of how production, logistics and value chains can be further optimized or reshaped into new business models by sensing technology and smart analytics is currently among the most important items on the agenda of decision-makers in the technology and industry segment.

The Internet of Things (IoT) stands for the interconnection of physical objects, which, besides human beings, also include sensors, household devices, cars, industrial equipment and much more. The IoT bridges the gap between the digital and the analog world, as it aims for maximum interconnection and the largest possible exchange of information.

In the era of the Internet of Things, a significant part of the "Product Value Function" is defined by new software functions and data services. This also offers the opportunity to develop completely new business models and to explore new revenue sources.

Cloud infrastructure and platforms are among the central drivers behind IoT services, since they offer ideal conditions as crucial enablers of backend services, which in turn allow companies to create value-added services for their customers and partners. Against this backdrop, Crisp Research has carried out this research project in order to support CIOs, CTOs and CEOs in the selection and evaluation of relevant IoT backend providers.

Considered and analyzed providers: Amazon Web Services, Microsoft Azure, VMware, Atos, Huawei, IBM, Salesforce, SAP, T-Systems/ Deutsche Telekom, Fujitsu, Vodafone, Tata Consultancy Services, Telefónica, Google, Oracle and QSC

The study report can be requested at "Crisp Vendor Universe: Evaluation of IoT Backend Providers".

Expert Panel: “Digitizing the Energy World” at AWS Summit Berlin 2016

Analyst Panel Discussion at OpenStack Summit Austin 2016


Analyst Q&A panel at OpenStack Summit Austin 2016 on "What the Analysts Who Cover OpenStack Really Think", featuring Paul Miller (Forrester), Roz Roseboro (Heavy Reading), Colm Keegan (ESG), Steve O'Grady (Redmonk) and Rene Buest (Crisp Research). The panel discussed the communications efforts of the OpenStack Foundation as well as the community, and provided direct and unvarnished feedback.

Panel Discussion: Analyst Q&A – What the Analysts Who Cover OpenStack *Really* Think from Rene Buest on Vimeo.

Analyst Report: IT Infrastructure 2020 – Enterprises in a fully Interconnected World


Future IT infrastructures are fundamentally different from those of today. They not only have a cloud-based character, but also face special requirements regarding scope, performance and stability, and must ensure a maximum density of interconnection. When planning their IT infrastructure agenda for 2020, CIOs should deal with certain topics in order to support business activities from a technology perspective.

If you are interested in reading the report "IT Infrastructure 2020 – Enterprises in a fully Interconnected World", please get in touch with me to receive a copy.

Interview: Public Cloud Services – In search of the white knight


Hybrid and multi-cloud strategies are near the top of the agenda for IT decision-makers. They understand that a modern, cloud-based IT world shouldn't just be drawn in black and white: diversity is needed, with services and innovations purchased from a larger number of cloud providers. Private clouds quickly reach their limits here and don't offer the benefits of a public cloud.

What is the optimal strategy for using public cloud services in enterprise IT? Find the answers in an interview with T-Systems in “Public cloud services: in search of the white knight”.

Mr. Buest, public, private, hybrid: when does which cloud offering become relevant for a company?
There's no catch-all answer here. We are now seeing an increasing number of companies that are intensely engaging with the public cloud, following an "all in" approach. This means they no longer run local IT infrastructure or internal data centers; instead, they migrate everything to public cloud infrastructure or platforms, or purchase what they need under a SaaS (Software-as-a-Service) model. However, these companies are still a minority.

…and that means?
At the moment, most companies prefer to use private cloud environments. It’s a logical consequence of the legacy solutions that companies still maintain in their IT. However, we believe that in the future, a majority of German companies will move to hybrid or multi-cloud architectures, enabling them to cover all the facets they need for their digital transformation.

And how can companies coordinate these different solutions in combination?
By using cloud management solutions that have interfaces to the most common public cloud offerings as well as to private cloud solutions. They provide powerful tools for managing workloads in different environments and shifting virtual machines, data and applications around. Another option for seamless management is iPaaS: integration Platform as a Service provides cloud-based integration solutions. In the pre-cloud era, such solutions were also called "middleware". They support the interaction between different cloud services.

What do companies have to watch out for principally when using these cloud services?
They should not underestimate the lack of understanding of the public cloud, nor the challenges associated with setting up and operating multi-cloud environments. The benefits gained from using multi-cloud infrastructures, platforms and services often come at a heavy price: namely, the costs that result from the complexity, integration, management and necessary operations. Multi-cloud management and a general lack of cloud experience are currently the key challenges many companies are facing.

What is the solution?
Managed public cloud providers (MPCPs) are positioning themselves as “white knights” or “friends in need”. They develop and operate the systems, applications and virtual environments for their customers – in both the public cloud infrastructures and multi-cloud environments – in a managed cloud service model.

– – –
The interview with T-Systems has been published under “Public cloud services: What really matters“.

Interview: Innovation and scalability in the public cloud


In this interview with Cloud Era Institute, I discuss the growing trend of companies opting for the public cloud to leverage scalable infrastructure and global reach. I also share how vendor lock-in contributes to innovation, draw an important distinction between data privacy and data security, and explain why the public cloud is a shared responsibility.

What trends are you seeing in the public cloud right now?

The first trend is that the public cloud is growing because enterprises need innovation and global scalability. Previously, enterprises talked about building private cloud environments, but soon realized the financial impact of building a massive, scalable infrastructure. Last year, Amazon Web Services (AWS) released over seven hundred new services and functionalities, something that would not be possible for a private cloud or a web hosting company. Amazon and Microsoft Azure are investing heavily in innovations at the infrastructure level, in data center operations, and in new services.

Another big trend we are seeing is containers like Docker, which have gained momentum because of the importance of portability. With Docker, you can move workloads to different cloud providers: you can encapsulate your application and its dependencies in a container, then move everything from one system to another.

A third trend is microservices such as Azure Machine Learning or the AWS Simple Notification Service. You can use the microservice approach to create your own powerful application.

Netflix is an on-demand video streaming platform with massive scalability and a highly available application on top of AWS, built on a microservice architecture. Their application keeps running because when one microservice has a problem, the others remain unaffected. It is a single application running on top of AWS and connected via a public API.

CIOs and developers typically don't like vendor lock-in, but I believe it helps with innovation. If you are using an iPhone, you are totally locked into the Apple environment, and you love it. Apple is able to innovate because it has a closed ecosystem. It's the same with AWS and Azure, since they also have service lock-in. This is how companies are able to innovate.

What has been the biggest challenge for businesses in the public cloud?

For Germany, Europe, and the U.S., it is data privacy and security. It is important to separate data privacy from data security. Data privacy is about legal issues and ensuring that you comply with the law. Data security means that data is stored securely so nobody can access it without authorization.

Germany thinks its data centers are more secure than those in the U.S., which is not true. Data centers in Germany, the U.S., or Australia offer the same physical security. When it comes to data security, storing data in the U.S. is no big deal.

Another big issue is a lack of cloud knowledge. The cloud has been around for more than ten years, yet there is still a global knowledge lag. Many people do not understand how to create applications for cloud platforms, from the design to the architecture, microservices, and containers.

Public clouds are shared-security environments. There is a lack of knowledge about this as well. A public cloud provider is only responsible for the physical infrastructure and ensuring that the virtual infrastructure can be deployed.

Everything on top of the virtual infrastructure belongs to the customer. In the public cloud, the customer has to create their own virtual infrastructure, for example on top of AWS, and then has to run systems and applications on top of it. To fire up a virtual machine is not cloud. The application and virtual infrastructure must also be scalable.

What have you observed about marketing as it relates to the cloud?

It is not only a cloud issue; many companies simply do not focus on content marketing. It is better to market your products with good content, not just advertisements. Unique content is just as important as having an expert voice contributing to it. It is better to let the experts write, not the marketing people.

– – –
The interview with Cloud Era Institute has been published under “Global technology expert, Rene Buest, on innovation and scalability in the public cloud“.


I am leaving Crisp Research


In February 2014, I joined Crisp Research as senior analyst & cloud practice lead, taking over responsibility for the entire cloud computing, IT infrastructure, IT platforms, IT automation and Internet of Things research. Over the last years it was fun to help build the brand "Crisp Research" and its awareness in the German and international markets from scratch.

I had the honor of working with a team of talented people including Maximilian Hille, Bjoern Boettcher, Meike Buch et al. The team and its support helped me publish 93 research notes as well as 45 studies, strategy papers and analyst reports, including:

I also had the honor to speak at several national and international summits and vendor conferences resulting in

Not to forget the German and international media like Computerwoche, CIO Magazin, LANline, Silicon.de, New York Times, Forbes Magazin, Handelsblatt, Die Zeit, Frankfurter Allgemeine Zeitung, Wirtschaftswoche, Manager Magazin, Harvard Business Manager et al. who cited me as part of their articles – leading to

Now, after almost three years, I am leaving Crisp Research. I want to say thank you to all the Crisp Research colleagues, clients, analyst relations contacts, PR agencies and journalists I've been working with over the years. It was a great pleasure to collaborate with all of you and I hope we have the chance to stay in touch.

Rene

Joining AI pioneer Arago as Director Market Research & Technology Evangelism


I've kept it a secret, and fewer than a handful of people have known so far. On January 1, I joined artificial intelligence (AI) pioneer Arago as "Director Market Research & Technology Evangelism". In this role, I am responsible for analysis & content as well as for market intelligence & analyst relations, and I am one of Arago's official spokespersons for media, events and keynotes. I am very excited about this new gig since I'll be working with a team of smart and highly talented people and can still make use of my 7+ years of experience as a technology analyst.

About Arago and HIRO

Arago is one of the most compelling companies in the AI field, focusing on building tools for the B2B sector using AI and analytical components. Although the company was started in 1995, founder and CEO Chris Boos has managed to maintain Arago's start-up mentality, keeping the entire company on an agile path.

Arago's AI-driven core solution HIRO is built to run and automate the entire IT stack, from IT operations up to business processes, supported by sophisticated algorithms. HIRO's autonomic problem-solving AI analyzes large problem sets and applies XML-based Knowledge Items (KIs) and past experience to find a solution, retaining the gained knowledge to be prepared for similar challenges in the future. This knowledge is not only retained for reference but can also be applied to different issues and deployed in dynamic environments. Through its use of memory and contextual reasoning, HIRO simulates human problem-solving behavior.

HIRO and the Freeciv Strategy Challenge

In the world of AI it has become best practice to demonstrate a system's capabilities using a game. Arago and HIRO are taking on this challenge by competing against Freeciv players from around the world. Arago decided to let HIRO play Freeciv since

  • the number of possible games in Freeciv is exponentially higher compared to e.g. Go or Chess.
  • Freeciv is not a perfect information game that gives players insight into the whole world and a view of opponents’ moves.
  • Freeciv imitates real world randomness.

In addition, HIRO is primed to address this challenge because instead of simply using trial and error to statistically understand “what” needs to be done, it employs human intelligence to understand “why”.

If you want to learn more about my role, Arago and HIRO, just get in touch with me via email or Twitter, or check regularly where you can meet me in person.

Figure: Chatbot Cloud Provider Overview


Chatbot Cloud Provider Overview
