Some Storage Trends to Watch
Video volumes and the data collected from various sensors are expected to grow sharply in the near future. These trends will change how information is stored and managed, so IT administrators and directors must plan ahead and be prepared for what is coming.
Data Management Standardization
Every forward-looking company wants to unlock the value of its data, but the data first has to become more manageable. Traditional tools fade into the background here: above all, the enterprise needs powerful hardware and information management that is as standardized as possible.
For example, companies should centralize the administration of their existing storage systems, ideally managing them through a single interface. Data is much easier to sort, control and use when it can be interpreted uniformly.
Another trend is storage automation aimed at increasing efficiency, reducing administration costs and eliminating errors. For example, analytics based on AI and machine learning can predict capacity utilization, flag rarely used data for migration to other storage tiers, and identify potential risks.
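One building block of such analytics is identifying "cold" data by access patterns. The sketch below, a minimal illustration rather than any vendor's implementation, scans a directory tree and flags files whose last access time exceeds an assumed 90-day threshold as candidates for demotion to a cheaper tier:

```python
import os
import time

# Illustrative threshold: files untouched for 90 days are candidates
# for a cheaper storage tier. Real systems would combine many signals.
COLD_AFTER_SECONDS = 90 * 24 * 3600

def find_cold_files(root):
    """Yield (path, days_idle) for files not accessed within the threshold."""
    now = time.time()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            idle = now - os.stat(path).st_atime
            if idle > COLD_AFTER_SECONDS:
                yield path, idle / 86400

# Usage sketch: list archive candidates under /data.
# for path, days in find_cold_files("/data"):
#     print(f"{path}: idle {days:.0f} days -> move to archive tier")
```

In production, a policy engine would typically act on these candidates automatically, moving them to an archive tier and leaving a stub or updating the namespace.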
Growth of the global market for AI-powered storage. Source: Market Research Future
Solutions based on artificial intelligence and machine learning for managing and analyzing large volumes of data will see wide adoption, especially as the complexity and variety of data continue to grow.
Hybrid storage systems and tiered data storage (tiering)
Many companies prefer a combination of on-premises storage and cloud platforms. Where fast access to large amounts of data is needed, SANs or other local systems remain essential, while the cloud is better suited to backup and archiving. To optimize the distribution of resources, tiered storage mechanisms automatically determine the optimal placement for data.
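A tiering mechanism is, at its core, a placement policy. The sketch below is a deliberately simplified illustration (the tier names, latencies and costs are assumptions, not real product figures): it picks the cheapest tier that still satisfies a dataset's latency requirement.

```python
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    max_latency_ms: float  # worst-case access latency this tier delivers
    cost_per_gb: float     # assumed monthly cost, illustrative only

# Example tiers, fastest/most expensive first (hypothetical numbers).
TIERS = [
    Tier("nvme-flash", 0.1, 0.25),
    Tier("sas-hdd", 10.0, 0.05),
    Tier("cloud-archive", 1000.0, 0.004),
]

def choose_tier(required_latency_ms):
    """Pick the cheapest tier whose latency still meets the requirement."""
    eligible = [t for t in TIERS if t.max_latency_ms <= required_latency_ms]
    # Fall back to the fastest tier if nothing is fast enough.
    return min(eligible, key=lambda t: t.cost_per_gb) if eligible else TIERS[0]
```

A real tiering engine would also weigh access frequency, data growth forecasts and migration costs, but the cost-versus-latency trade-off shown here is the heart of the decision.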
Storage technologies in use – now and in two years. Source: Spiceworks Research forecast
Artificial intelligence and fast data storage
Another trend that can profoundly affect storage solutions is the development of artificial intelligence. Large amounts of information come into play here, especially during machine and deep learning: the AI system examines existing data for certain characteristics and is then "trained" accordingly.
Growth of storage for AI workloads worldwide. Source: IDC
Wherever GPU-based computing systems are used, fast data exchange between AI algorithms and the underlying storage system is crucial. Ultimately, the same principle applies: you need to find the right combination of on-premises and cloud storage.
Local data centers for faster communications
Cloud providers increasingly have to offer the fastest possible connection to their customers' corporate infrastructure. For this reason, new data centers, such as those of Microsoft or Amazon, are being located "closer to the user." This eliminates, or at least minimizes, problems caused by a slow connection to the cloud server.
The same applies to small regional cloud providers, which are far more decentralized than the Azure or AWS data center infrastructure. They need a good internet connection, which is easier to obtain with small local data centers. Regional providers of this type are a reasonable compromise between cost and performance, and they can serve as high-speed access points to public clouds in multi-cloud deployments.
Use of cloud storage services – now and in two years. Source: Spiceworks Research forecast
Many companies use several public cloud services alongside their on-premises infrastructure. This affects how data is moved and migrated and how applications access information. Cloud-ready storage systems are designed to provide data portability between multiple cloud platforms, as well as between on-premises platforms and the cloud.
Backup and recovery solutions must meet new requirements
The constant growth of stored data volumes will affect backup and recovery: restoring petabytes of lost data is far harder than restoring gigabytes or terabytes. The same applies to archiving, although that process is less time-critical than restoring from backup. Other factors, such as smart indexing and metadata storage, will play a decisive role here, because searching unstructured data – video content, for example – must be made as simple as possible.
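To make the role of metadata concrete, here is a minimal sketch of a metadata catalog for unstructured assets such as video files. The schema and field names are illustrative assumptions; real systems use dedicated catalogs or object-store metadata, but the principle is the same: search the index, not the content.

```python
import sqlite3

def build_index(db_path=":memory:"):
    """Create a tiny metadata catalog (illustrative schema)."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS assets ("
        " path TEXT PRIMARY KEY, kind TEXT, duration_s REAL, tags TEXT)"
    )
    return conn

def add_asset(conn, path, kind, duration_s, tags):
    # Tags are stored comma-separated for simplicity.
    conn.execute("INSERT OR REPLACE INTO assets VALUES (?, ?, ?, ?)",
                 (path, kind, duration_s, ",".join(tags)))

def search_by_tag(conn, tag):
    """Return paths of assets whose tag list contains the given tag."""
    rows = conn.execute(
        "SELECT path FROM assets WHERE ',' || tags || ',' LIKE ?",
        ("%," + tag + ",%",)).fetchall()
    return [r[0] for r in rows]
```

With an index like this, finding one interview clip among petabytes of footage becomes a millisecond query instead of a scan of the storage itself.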
High Performance Computing – For Midsize Businesses
In the near future, even medium-sized enterprises will not be able to operate effectively without HPC solutions. Whereas high-performance computing used to be the domain of universities and government computing centers, the situation has changed. As data grows, HPC solutions will be needed wherever computing and simulation applications demand large storage capacities.
For example, a large design office performing very complex calculations will need such solutions for computing and visualizing three-dimensional objects. Given the amount of data involved, this work would otherwise be extremely time-consuming or impossible altogether.
New trends in the data storage industry include object storage, which improves indexing and metadata handling, and SCM (storage-class memory) modules, which speed up access to information through intelligent tiering mechanisms. SCM drives can cut latency by up to 50%. This non-volatile memory, with performance approaching that of RAM, can significantly accelerate the processing of large data sets.
In addition, flash technology in the form of SSDs will continue to displace classic hard drives in the enterprise environment.
SSD and NVMe
Companies will continue to replace traditional HDDs with solid-state drives to improve performance, simplify management and reduce system power consumption. New generations of flash arrays will offer more advanced means of automation and data protection, as well as integration with public clouds.
The NVMe protocol will be deployed on a much larger scale. Lenovo's storage portfolio already includes NVMe-enabled systems that boost the performance of its DM Series disk arrays.
The DM7000F all-flash array was the first product on the market to enable an end-to-end NVMe over Fabrics solution from server to storage.
PCIe SSDs based on the NVMe specification – an access protocol for solid-state drives connected via the PCI Express bus – are one of the main trends in storage technology.
Forecast growth of total shipped capacity by drive type worldwide. Source: IDC
According to IDC's forecast, by 2021 flash arrays with NVMe and NVMe-oF connectivity will generate about half of all external storage system revenue. Analysts believe NVMe-oF has great potential because the specification delivers extremely high throughput with ultra-low latency, paving the way for distributed storage systems built on a low-latency fabric.
As the technology advances, Lenovo intends to protect customers' investments: existing systems can support new protocols without hardware replacement, and the latest Lenovo NVMe platform is direct proof of this. Storage systems with NVMe drives show their best qualities when working with databases, so to deliver applications with the lowest possible latency, the DM7100 array supports pluggable NVMe drives. This significantly increases storage performance in IOPS and makes it possible to build powerful, scalable storage.
This is, of course, an exciting time for the data storage industry: a new generation of storage systems is already entering the game, and further innovations are appearing on the horizon. All of them answer the pressing need to work with clouds, automate operations, and optimize data management and processing.