The future of autonomous computing in data centers
17 September 2018


The revival of autonomous computing is a step toward building the software-defined data center (SDDC). The stage of widespread adoption of virtualization in data centers is already complete; now new mechanisms that previously had no chance of being implemented are on the way.

"The opportunity appeared only today," says Charles Crouchman, technical director of Turbonomic. - Now the task is to learn how to automatically manage the work of the data center. Until now, management has been carried out by collecting statistics on the operation of subsystems and their analysis for reconfiguration. Thanks to the model of autonomous computing, you can move from point corrections to continuous control".

In this regard, it is worth mentioning the recent joint project of Zenoss, a developer of IT monitoring and analytics systems, and SaltStack, which specializes in automated configuration management. The goal of the project is a mechanism that improves the management of systems running in an SDDC: the future solution should be able to scale the data center's load in response to changes, intelligently switch the configurations of running systems, and take into account both events and the planned schedule of resource consumption.
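A minimal sketch of that kind of decision logic might look as follows: live events and a planned load schedule are combined to pick a configuration profile, which is then handed off to a configuration-management run. The profile names, the event format and the apply_profile function are hypothetical; this is not the actual Zenoss/SaltStack integration.

```python
# Hypothetical event- and schedule-aware reconfiguration: choose a
# configuration profile from monitoring events plus a planned schedule.
from datetime import datetime


PLANNED_PEAKS = {9, 10, 11, 18, 19, 20}   # assumed hours of expected peak load


def choose_profile(events: list[str], now: datetime) -> str:
    """Combine live events with the planned schedule to select a profile."""
    if "host_down" in events:
        return "failover"                 # react to an outage event first
    if now.hour in PLANNED_PEAKS:
        return "scale_out"                # anticipate scheduled peak demand
    return "baseline"                     # otherwise run the default layout


def apply_profile(profile: str) -> None:
    """Placeholder for a configuration-management run (e.g. a state apply)."""
    print(f"Applying configuration profile: {profile}")


if __name__ == "__main__":
    current_events = ["high_latency"]     # would normally come from monitoring
    apply_profile(choose_profile(current_events, datetime.now()))
```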

ServiceNow has its own vision of future automation. Its ServiceWatch Suite product can serve as a single tool for managing configurations based on accumulated monitoring data. The collected metrics are displayed on a dashboard, giving an overview of service quality. Using this data, you can manage the resources of public and private clouds, carry out organizational and technical measures for their reconfiguration and provisioning, monitor the quality of services, and implement event management.
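The sketch below shows, under assumed metric names and thresholds, what managing by accumulated monitoring data can amount to: per-service metrics are checked against a quality target and violations are turned into events for follow-up. It illustrates the general idea only and does not reflect the ServiceWatch data model.

```python
# Hypothetical service-quality check: flag services that miss an assumed SLA
# and emit events that an event-management process could pick up.
from dataclasses import dataclass


@dataclass
class ServiceMetrics:
    name: str
    availability_pct: float     # observed availability over the window
    p95_latency_ms: float       # 95th-percentile response time


SLA_AVAILABILITY = 99.9         # assumed availability target, %
SLA_LATENCY_MS = 300.0          # assumed latency target, ms


def evaluate(services: list[ServiceMetrics]) -> list[str]:
    """Return event messages for services that violate the assumed SLA."""
    events = []
    for svc in services:
        if svc.availability_pct < SLA_AVAILABILITY:
            events.append(f"{svc.name}: availability {svc.availability_pct}% below SLA")
        if svc.p95_latency_ms > SLA_LATENCY_MS:
            events.append(f"{svc.name}: p95 latency {svc.p95_latency_ms} ms above SLA")
    return events


if __name__ == "__main__":
    snapshot = [
        ServiceMetrics("billing", 99.95, 180.0),
        ServiceMetrics("storefront", 99.4, 420.0),
    ]
    for event in evaluate(snapshot):
        print(event)    # in a real system these would feed event management
```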

The spread of the new tools is held back by companies' reluctance to spend money on functions whose reliability has not yet been proven, as well as by a lack of in-house experience in maintaining IT infrastructure. Nevertheless, the market is actively studying the new products; above all, many are interested in how quickly such systems pay for themselves.


