workload data


IDC MarketScape: Worldwide All Flash Array 2017 Vendor Assessment

Published By: NetApp APAC     Published Date: Jul 04, 2019
This IDC study provides an evaluation of 10 vendors that sell all-flash arrays (AFAs) for dense mixed enterprise workload consolidation that includes at least some mission-critical applications. "All-flash arrays are dominating primary storage spend in the enterprise, driving over 80% of that revenue in 2017," said Eric Burgener, research director, Storage. "Today's leading AFAs offer all the performance, capacity scalability, enterprise-class functionality, and datacenter integration capabilities needed to support dense mixed enterprise workload consolidation. More and more IT shops are recognizing this and committing to 'all flash for primary storage' strategies."

Gartner for AI

Published By: IBM APAC     Published Date: Jul 19, 2019
It’s important to understand the impact of AI workloads on data management and storage infrastructure. If you’re selecting infrastructure for AI workloads involving ML and deep learning, you must understand the unique requirements of these emerging workloads, especially if you’re looking to use them to accelerate innovation and agility. This Gartner report highlights three main impacts that AI workloads have on data management and storage.

Secure Hybrid Cloud for Dummies

Published By: Group M_IBM Q3'19     Published Date: Aug 12, 2019
Welcome to Secure Hybrid Cloud For Dummies, IBM Limited Edition. The hybrid cloud is becoming the way enterprises are transforming their organizations to meet changing customer requirements. Businesses are discovering that in order to support the needs of customers, there is an imperative to leverage the highly secure IBM Z platform to support mission-critical workloads, such as transaction management applications. The Z platform has been transformed over the years. The combination of z/OS, LinuxONE, open APIs, and the inclusion of Kubernetes has made IBM Z a critical partner in the hybrid cloud world. Businesses can transform their IBM Z environments into a secure, private cloud. In addition, through IBM’s public cloud, businesses may take advantage of IBM Z’s security services to protect their data and applications.

Overcoming Petabyte-Scale Storage Challenges for Big Data and Analytics

Published By: Infinidat EMEA     Published Date: May 14, 2019
Big Data and analytics workloads represent a new frontier for organizations. Data is being collected from sources that did not exist 10 years ago. Mobile phone data, machine-generated data, and website interaction data are all being collected and analyzed. In addition, as IT budgets are already under pressure, Big Data footprints are getting larger and posing a huge storage challenge. This paper provides information on the issues that Big Data applications pose for storage systems and how choosing the correct storage infrastructure can streamline and consolidate Big Data and analytics applications without breaking the bank.

Infinidat® Open Stack Solutions

Published By: Infinidat EMEA     Published Date: May 14, 2019
Infinidat has developed a storage platform that provides unique simplicity, efficiency, reliability, and extensibility that enhances the business value of large-scale OpenStack environments. The InfiniBox® platform is a pre-integrated solution that scales to multiple petabytes of effective capacity in a single 42U rack. The platform’s innovative combination of DRAM, flash, and capacity-optimized disk delivers tuning-free, high performance for consolidated mixed workloads, including object/Swift, file/Manila, and block/Cinder. These factors combine to cut direct and indirect costs associated with large-scale OpenStack infrastructures, even versus “build-it-yourself” solutions. InfiniBox delivers seven nines (99.99999%) of availability without resorting to expensive replicas or slow erasure codes for data protection. Operations teams appreciate our delivery model, designed to drop easily into workflows at all levels of the stack, including native Cinder integration and Ansible automation playbooks.
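For context on the seven-nines figure in the abstract above, here is a minimal, vendor-neutral sketch (plain Python arithmetic, not Infinidat tooling or data beyond the 99.99999% figure) that converts an availability percentage into the downtime budget it implies:

```python
# Illustrative only: convert an availability percentage into a yearly downtime budget.
MINUTES_PER_YEAR = 365.25 * 24 * 60

def downtime_minutes_per_year(availability_percent: float) -> float:
    """Maximum downtime (in minutes per year) allowed at a given availability level."""
    return MINUTES_PER_YEAR * (1.0 - availability_percent / 100.0)

for label, pct in [("five nines", 99.999), ("six nines", 99.9999), ("seven nines", 99.99999)]:
    print(f"{label} ({pct}%): ~{downtime_minutes_per_year(pct) * 60:.0f} seconds of downtime per year")
# seven nines works out to roughly 3 seconds of unplanned downtime per year
```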

Get Rid of Database Workload Silos: Dell EMC SC5020 vs HPE Nimble

Published By: Dell EMC     Published Date: Aug 01, 2019
In the Principled Technologies datacenter, we tested the All-Flash Dell EMC SC5020 storage array and the HPE Nimble Storage AF5000 array to see how well they performed while handling two workloads at once. The Dell EMC array handled transactional database workloads and data mart imports better than the HPE solution without sacrificing performance. Download this whitepaper from Dell and Intel® to learn more.

Why Server Hardware Matters: Recent Survey Evaluates Server Adoption and Market Misperceptions

Published By: Dell EMC     Published Date: Aug 01, 2019
Software might run the world, but software still runs on hardware. It’s a misperception that hardware has little value anymore. Every application, every workload, every data set runs on physical servers. Read “Hardware Does Matter: Global Server Brands are Perceived as Superior for Driving Digital Business,” a Frost & Sullivan survey of 500 IT decision makers, on the value of global server brands vs. commodity servers. Look beyond commodity status to discover:
• Key server purchase criteria
• How top brands directly compare
• How to choose based on workload
Server brands vary significantly, and a commodity brand may not provide the outcomes you need, especially for new and next-generation applications. Download this analyst report from Dell EMC and Intel® to learn more.

Time for Prime Time: Effective Data Management for NoSQL and Hadoop Environments

Published By: Cohesity     Published Date: Aug 09, 2019
As organizations continue to look for ways to increase business agility, the need for a modern database architecture that can rapidly respond to the needs of the business is more apparent than ever. While an RDBMS still serves as a lifeline for many organizations, the adoption of technologies such as NoSQL and Hadoop is enabling organizations to best address database performance and scalability requirements while also satisfying the goals of embracing hybrid cloud and becoming more data-driven. And with organizations relying so heavily on these new technologies to yield rapid insights that positively impact the business, evaluating how those technologies are managed and protected is essential. Hadoop and NoSQL workloads are now pervasive in production environments and require “production-class” data protection, yet few data protection solutions offer such capabilities today.

Data Protection for Modern Workloads: Protecting Office 365 with Cohesity

Published By: Cohesity     Published Date: Aug 09, 2019
In a context of mass data fragmentation on-premises and in the cloud, organizations now struggle with the compounded complexities brought about by modern workloads such as containers, NoSQL/NewSQL databases, and SaaS applications. These new workloads are turning traditional backup and recovery approaches on their head—in particular, in Microsoft Office 365 deployments for which new backup, recovery, and data management schemas must be deployed.

Protect and Manage Secondary Data and Apps in a Hybrid Cloud Environment

Published By: Cohesity     Published Date: Aug 09, 2019
IT organizations everywhere are undergoing significant transformation to keep pace with the needs of their businesses. They’re tasked with consolidating data centers and migrating both workloads and data to the cloud. The transition has been easier for some than others. As hybrid architectures increasingly become the norm, how are enterprises gaining complete visibility, simplifying management, and making use of all of their data—both on-premises and in the cloud? Five enterprises explain how they’ve replaced multiple products that created legacy data silos with Cohesity – a single, hyperconverged, software-defined platform with native Microsoft Azure integration for simplified management of secondary data and applications. For them, Cohesity and Azure together boost IT agility while lowering costs, solving critical secondary data challenges spanning long-term retention, storage tiering, test/dev, disaster recovery, and cloud-native backup in a proven hybrid cloud architecture.

Solve Hybrid Cloud Challenges for Secondary Data and Apps

Published By: Cohesity     Published Date: Aug 09, 2019
Data for secondary workloads – backup, test/dev, disaster recovery, and archiving to name a few – has become siloed the same way application data has, leading to multiple point solutions to manage an increasing amount of data. This white paper looks at the evolution of these challenges and offers practical advice on ways to store, manage, and move secondary data in hybrid cloud architectures while extracting the hidden value it can provide.

Business-Critical Benefits of Workload Automation, by EMA

Published By: ASG Software Solutions     Published Date: Nov 05, 2009
Effective workload automation that provides complete management-level visibility into real-time events impacting the delivery of IT services is needed by the data center more than ever before. The traditional job scheduling approach, with its uncoordinated set of tools that often requires reactive manual intervention to minimize service disruptions, is failing more than ever in today’s complex IT world of multiple platforms, applications, and virtualized resources.
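As a generic illustration of the contrast the abstract describes (a coordinated, event-visible workload run versus isolated, manually watched jobs), here is a minimal sketch in plain Python; it is not ASG’s product or API, just a toy dependency-ordered run that surfaces status events:

```python
# Toy example only (not ASG software): run dependent jobs in order and emit status
# events, instead of firing isolated, cron-style jobs and watching them by hand.
from collections import deque

def run_workload(jobs: dict, execute) -> None:
    """jobs maps a job name to its list of prerequisite job names; execute(name) runs one job."""
    indegree = {name: len(deps) for name, deps in jobs.items()}
    dependents = {name: [n for n, deps in jobs.items() if name in deps] for name in jobs}
    ready = deque(name for name, count in indegree.items() if count == 0)
    while ready:
        job = ready.popleft()
        print(f"EVENT: starting {job}")
        execute(job)                       # the real job body would go here
        print(f"EVENT: finished {job}")
        for nxt in dependents[job]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)

# Hypothetical three-step batch: extract -> transform -> load
run_workload({"extract": [], "transform": ["extract"], "load": ["transform"]},
             execute=lambda name: None)
```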
Tags : 
asg, cmdb, bsm, itil, bsm, metacmdb, workload automation, wla

Archiving and SAP Environments: For Business, IT, and Regulatory Requirements

Published By: ASG Software Solutions     Published Date: Feb 24, 2010
A recent survey of CIOs found that over 75% want to develop an overall information strategy in the next three years, yet over 85% are not close to implementing an enterprise-wide content management strategy. Meanwhile, data runs rampant, slows systems, and impacts performance. Hard-copy documents multiply, become damaged, or simply disappear.
Tags : 
asg, cmdb, bsm, itil, bsm, metacmdb, archiving, sap

Secrets to BSM Success: End-to-End Visibility and Customer Focus

Published By: ASG Software Solutions     Published Date: Feb 23, 2010
There are success stories of businesses that have implemented Business Service Management (BSM) with well-documented, bottom-line results. What do these organizations know that their discouraged counterparts don't?
Tags : 
asg, cmdb, bsm, itil, bsm, metacmdb, archiving, sap

Creating the Data Center of the Future with Hyperconverged Infrastructure

Published By: Hewlett Packard Enterprise     Published Date: May 11, 2018
Most IT professionals today recognize that enterprise IT will be hybrid in the future. To provide the optimal foundation for each workload being deployed, the hybrid IT environment will include cloud-based infrastructures—from multiple providers—co-existing alongside infrastructure within the enterprise data center or a hosted environment. But not all hyperconverged solutions yield the same results. The right hyperconverged infrastructure can meet your IT needs both today and well into the future. In this paper, we will talk about where your data center needs to be in the next five years to meet changing business demands, and how the roles of IT professionals will evolve. We will also review “hyperconvergence” models, how they can help meet those needs, and the benefits you can expect along the way. Finally, we discuss what to look for in the right hyperconverged provider, one that will position your IT department for success.

ESG: HPE 3PAR Flash Now: Accelerating All-flash Data Center Transformation

Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Over the past several years, the IT industry has seen solid-state (or flash) technology evolve at a record pace. Early on, the high cost and relative newness of flash meant that it was mainly relegated to accelerating niche workloads. More recently, however, flash storage has “gone mainstream” thanks to maturing media technology. Lower media cost has resulted from memory innovations that have enabled greater density and new architectures such as 3D NAND. Simultaneously, flash vendors have refined how to exploit flash storage’s idiosyncrasies—for example, they can extend the flash media lifespan through data reduction and other techniques.
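As a back-of-the-envelope illustration of the lifespan point above (generic SSD endurance arithmetic with hypothetical drive numbers, not vendor data), the sketch below shows how a given data-reduction ratio stretches a drive’s rated endurance:

```python
# Hypothetical numbers for illustration: a drive rated for 7,000 TBW (terabytes written),
# ingesting 2 TB of host writes per day. Data reduction cuts the bytes that physically
# land on flash, so the rated endurance lasts proportionally longer.
def years_of_endurance(rated_tbw: float, host_writes_tb_per_day: float, reduction_ratio: float) -> float:
    physical_tb_per_day = host_writes_tb_per_day / reduction_ratio
    return rated_tbw / physical_tb_per_day / 365.0

for ratio in (1.0, 2.0, 3.0):
    print(f"{ratio:.0f}:1 reduction -> ~{years_of_endurance(7000, 2.0, ratio):.0f} years of rated life")
```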

Technical overview of HPE 3PAR File Persona Software technical white paper

Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Today’s data centers are expected to deploy, manage, and report on different tiers of business applications, databases, virtual workloads, home directories, and file sharing simultaneously. They also need to co-locate multiple systems while sharing power and energy. This is true for large as well as small environments. The trend in modern IT is to consolidate as much as possible to minimize cost and maximize efficiency of data centers and branch offices. HPE 3PAR StoreServ is highly efficient, flash-optimized storage engineered for the true convergence of block, file, and object access to help consolidate diverse workloads efficiently. HPE 3PAR OS and converged controllers incorporate multiprotocol support into the heart of the system architecture.

Langton Blue: HPE 3PAR Adaptive Data Reduction A competitive comparison of array-based data

Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Modern storage arrays can’t compete on price without a range of data reduction technologies that help reduce the overall total cost of ownership of external storage. Unfortunately, no single data reduction technology fits all data types, and savings come from both data deduplication and compression, depending on the workload. Typically, OLTP-type data (databases) works well with compression and can achieve between 2:1 and 3:1 reduction, depending on the data itself. Deduplication works well with large volumes of repeated data such as virtual machines or virtual desktops, where many instances or images are based on a similar “gold” master.
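To make the ratios above concrete, here is a small, vendor-neutral sketch (the workload mix and numbers are hypothetical, not HPE data) that estimates effective capacity when different data types reduce at different rates:

```python
# Hypothetical mix, for illustration only: half of the raw capacity holds OLTP data that
# compresses ~2.5:1, the other half holds VM images that deduplicate ~4:1.
def effective_capacity_tb(raw_tb: float, workloads: dict) -> float:
    """workloads maps a name to (fraction_of_raw_capacity, reduction_ratio); 3.0 means 3:1."""
    assert abs(sum(frac for frac, _ in workloads.values()) - 1.0) < 1e-9
    return sum(raw_tb * frac * ratio for frac, ratio in workloads.values())

mix = {"oltp_databases": (0.5, 2.5), "vm_images": (0.5, 4.0)}
print(f"100 TB raw -> ~{effective_capacity_tb(100.0, mix):.0f} TB effective")  # ~325 TB
```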

Hidden Costs of Virtualization Backup Solutions, Revealed

Published By: Commvault     Published Date: Jul 06, 2016
Today, nearly every datacenter has become heavily virtualized. In fact, according to Gartner, as many as 75% of x86 server workloads are already virtualized in the enterprise datacenter. Yet even with the growth rate of virtual machines outpacing that of physical servers industry-wide, most virtual environments continue to be protected by backup systems designed for physical servers, not the virtual infrastructure they are deployed on. Data protection products that are virtualization-focused may deliver additional support for virtual processes, but there are still pitfalls in selecting the right approach. This paper will discuss five common costs that can remain hidden until after a virtualization backup system has been fully deployed.
Tags : 
storage, backup, recovery, best practices, networking, it management, enterprise applications, data management

Fastest Speed, Highest Security: The Co-engineering of Oracle's New M8 Systems

Published By: Oracle CX     Published Date: Oct 19, 2017
Oracle has just announced a new microprocessor, and the servers and engineered systems that are powered by it. The SPARC M8 processor fits in the palm of your hand, but it contains the result of years of co-engineering hardware and software to run enterprise applications with unprecedented speed and security. The SPARC M8 chip contains 32 of today’s most powerful cores for running Oracle Database and Java applications. Benchmarking data shows that the performance of these cores reaches twice the performance of Intel’s x86 cores. This is the result of exhaustive work on designing smart execution units and threading architecture, and on balancing metrics such as core count, memory, and I/O bandwidth. It also required millions of hours of testing chip design and operating system software on real workloads for database and Java. Having faster cores means increasing application capability while keeping the core count and software investment under control. In other words, a boost in per-core performance translates into more capability for the same core count and software spend.
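To illustrate the per-core licensing argument in the abstract (the throughput figures and license price below are hypothetical, not Oracle benchmark or pricing data), a quick sketch:

```python
# Hypothetical numbers: if each core delivers twice the throughput, roughly half as many
# cores (and per-core software licenses) are needed for the same workload capacity.
import math

def cores_needed(target_throughput: float, per_core_throughput: float) -> int:
    return math.ceil(target_throughput / per_core_throughput)

TARGET = 1000.0                 # arbitrary units of database throughput
LICENSE_COST_PER_CORE = 5000    # hypothetical per-core software license cost

for label, per_core in [("baseline core", 10.0), ("core with 2x per-core performance", 20.0)]:
    n = cores_needed(TARGET, per_core)
    print(f"{label}: {n} cores, ${n * LICENSE_COST_PER_CORE:,} in per-core licenses")
```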

Secure Cloud Infrastructure

Published By: Oracle CX     Published Date: Oct 19, 2017
Enterprises today need to become more agile and meet new and increasing workload and security requirements, all while reducing overall IT cost and risk. To meet these requirements, many companies are turning to cloud computing. To remain competitive, companies need to formulate a strategy that can easily move them from traditional on-premises IT to private or public clouds. A complete cloud strategy will likely include both private and public clouds, because some applications and data might not be able to move to a public cloud. Moving to the cloud should not create information silos but should improve data sharing. Any cloud strategy should make sure that it is possible to integrate on-premises, private cloud, and public cloud data and applications. Furthermore, any on-premises cloud deployments must be able to easily migrate to the public cloud in the future.

The Most Advanced Systems for Cloud and Scale-out: SPARC S7

Published By: Oracle CX     Published Date: Oct 19, 2017
Enterprises today need to become more agile and meet new and increasing workload and security requirements, all while reducing overall IT cost and risk. To meet these requirements, many companies are turning to cloud computing. To remain competitive, companies need to formulate a strategy that can easily move them from traditional on-premises IT to private or public clouds. A complete cloud strategy will likely include both private and public clouds, because some applications and data might not be able to move to a public cloud. Moving to the cloud should not create information silos but should improve data sharing. Any cloud strategy should make sure that it is possible to integrate on-premises, private cloud, and public cloud data and applications. Furthermore, any on-premises cloud deployments must be able to easily migrate to the public cloud in the future.

Eddison White Paper: Advantages & Efficiencies of Oracle SPARC S7 Server Over Commodity Alternatives

Published By: Oracle CX     Published Date: Oct 20, 2017
This whitepaper explores the new SPARC S7 server features and then compares this offering to a similar x86 offering. The key characteristics of the SPARC S7 highlighted here are:
• Designed for scale-out and cloud infrastructures
• SPARC S7 processor with greater core performance than the latest Intel Xeon E5 processor
• Software in Silicon, which offers hardware-based features such as data acceleration and security
The SPARC S7 is then compared to a similar x86 solution from three different perspectives: performance, risk, and cost. Performance matters as business markets are driving IT to provide an environment that:
• Continuously provides real-time results
• Processes more complex workload stacks
• Optimizes usage of per-core software licenses
Risk matters today and into the foreseeable future, as challenges to secure systems and data are becoming more frequent and invasive, from within and from outside. Oracle SPARC systems approach risk management from multiple perspectives.

Next-Generation Secure Infrastructure Platform - Oracle SPARC M8 Launch Webcast

Published By: Oracle CX     Published Date: Oct 20, 2017
Oracle has just announced a new microprocessor, and the servers and engineered systems that are powered by it. The SPARC M8 processor fits in the palm of your hand, but it contains the result of years of co-engineering hardware and software to run enterprise applications with unprecedented speed and security. The SPARC M8 chip contains 32 of today’s most powerful cores for running Oracle Database and Java applications. Benchmarking data shows that the performance of these cores reaches twice the performance of Intel’s x86 cores. This is the result of exhaustive work on designing smart execution units and threading architecture, and on balancing metrics such as core count, memory, and I/O bandwidth. It also required millions of hours of testing chip design and operating system software on real workloads for database and Java. Having faster cores means increasing application capability while keeping the core count and software investment under control. In other words, a boost in per-core performance translates into more capability for the same core count and software spend.

LinuxONE: A Secure, Scalable Data-Serving Infrastructure Accelerates Digital Transformation

Published By: IBM     Published Date: Jun 29, 2018
LinuxONE from IBM is an example of a secure data-serving infrastructure platform that is designed to meet the requirements of current-gen as well as next-gen apps. IBM LinuxONE is ideal for firms that want the following:
• Extreme security: Firms that put data privacy and regulatory concerns at the top of their requirements list will find that LinuxONE comes built in with best-in-class security features such as EAL5+ isolation, crypto key protection, and a Secure Service Container framework.
• Uncompromised data-serving capabilities: LinuxONE is designed for structured and unstructured data consolidation and optimized for running modern relational and nonrelational databases. Firms can gain deep and timely insights from a "single source of truth."
• Unique balanced system architecture: The nondegrading performance and scaling capabilities of LinuxONE — thanks to a unique shared memory and vertical scale architecture — make it suitable for workloads such as databases and systems of record.