data center

Results 1 - 25 of 2310
Published By: gMed     Published Date: Dec 16, 2014
One of the most important decisions your gastroenterology practice can make is how you select and roll out the systems that carry your data. You need to run your practice efficiently and meet the requirements established by the Centers for Medicare and Medicaid Services (CMS) so you can benefit from Meaningful Use incentives and avoid future penalties. After all, healthcare is changing rapidly, and keeping up with new regulations and technology can be a full-time job. With healthcare reform’s key provisions kicking in, accountable care organizations forming, ICD-10 coding soon to be a requirement, and the importance of exporting data to registries growing, physicians have an extraordinary amount to address.
Tags : 
    
gMed
Published By: SunGard Availability Services     Published Date: Mar 31, 2014
This whitepaper summarizes an independent study of the IT needs and direction of the healthcare provider sector. You will find key findings from that study that may help those in the sector improve data center resiliency. The study included over 250 healthcare IT professionals (both business and technology influencers and decision makers) employed at healthcare providers across North America with annual revenues of $100M+ and/or 100+ beds.
Tags : 
    
SunGard Availability Services
Published By: GNAX Health     Published Date: Oct 01, 2013
Healthcare providers today are shifting how they manage their IT departments. With the rise of EHRs and other IT systems, healthcare providers increasingly need to outsource their data center solutions in order to maintain functionality. Colocating some or all of a healthcare organization’s IT alleviates IT burdens, reduces overhead, and increases operating efficiency. Perhaps most importantly, it provides healthcare organizations with a solid solution for disaster recovery. Disaster recovery colocation helps healthcare organizations reduce costs while meeting HIPAA requirements, which call for backing up mission-critical data in two diverse locations. For most healthcare providers, maintaining one reliable data center is costly enough; adding a second, redundant site in a diverse location can be exorbitant. This whitepaper explains the importance of disaster recovery colocation and includes nine critical criteria for selecting the ideal technology partner and geographic location.
Tags : 
    
GNAX Health
Published By: Nutanix     Published Date: Aug 22, 2019
Nutanix created hyperconverged infrastructure years ago because there was an urgent need for innovation within enterprise infrastructure. IT silos, management complexity, and gross inefficiencies were undermining the customer experience. It was time for a paradigm shift, which is why Nutanix melded webscale engineering with consumer-grade design to fundamentally transform the way organizations consume and leverage technology.
Tags : 
    
Nutanix
Published By: Nutanix     Published Date: Aug 22, 2019
There is more to the cloud than meets the eye. This journey helps you understand enterprise cloud and how it fits into your datacenter paradigm. By the end of this book, you will see how enterprise cloud can help you propel your business into the 22nd century.
Tags : 
    
Nutanix
Published By: NTT Ltd.     Published Date: Aug 05, 2019
VMware provides IT organizations a path to digital transformation, delivering consistent infrastructure and consistent operations across data centers and public clouds to accelerate application speed and agility for business innovation and growth.
Tags : 
    
NTT Ltd.
Published By: CloudHealth by VMware     Published Date: Sep 05, 2019
Public clouds have fundamentally changed the way organizations build, operate, and manage applications. Security for applications in the cloud is composed of hundreds of configuration parameters and is vastly different from security in traditional data centers. According to Gartner, “Through 2020, at least 95% of cloud breaches will be due to customer misconfiguration, mismanaged credentials or insider theft, not cloud provider vulnerabilities”1. The uniqueness of cloud requires that security teams rethink classic security concepts and adopt approaches that address serverless, dynamic, and distributed cloud infrastructure. This includes rethinking security practices across asset management, compliance, change management, issue investigation, and incident response, as well as training and education. We interviewed several security experts and asked them how public cloud transformation has changed their cloud security and compliance responsibilities. In this e-book, we share the top insights from those interviews.
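The e-book itself is not code-oriented, but one concrete instance of the misconfiguration problem described above is an object storage bucket whose access control list grants access to everyone. Below is a minimal sketch, assuming the AWS SDK for Python (boto3) and configured credentials (neither is mentioned in the source), that flags such buckets:

```python
# Illustrative only: flag S3 buckets whose ACLs grant access to everyone.
# Assumes boto3 is installed and AWS credentials are already configured.
import boto3

PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def find_public_buckets():
    s3 = boto3.client("s3")
    public = []
    for bucket in s3.list_buckets()["Buckets"]:
        acl = s3.get_bucket_acl(Bucket=bucket["Name"])
        for grant in acl["Grants"]:
            grantee = grant.get("Grantee", {})
            if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GROUPS:
                public.append((bucket["Name"], grant["Permission"]))
    return public

if __name__ == "__main__":
    for name, permission in find_public_buckets():
        print(f"Publicly accessible bucket: {name} ({permission})")
```

Real cloud security posture management covers many more checks than this, but the pattern of enumerating resources and testing each configuration parameter against policy is the same.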
Tags : 
    
CloudHealth by VMware
Published By: Group M_IBM Q3'19     Published Date: Jul 01, 2019
This white paper considers the pressures that enterprises face as the volume, variety, and velocity of relevant data mount and the time to insight seems unacceptably long. Most IT environments seeking to leverage statistical data for analysis that can power decision making must glean that data from many sources, put it together in a relational database that requires special configuration and tuning, and only then make it available for data scientists to build models that are useful for business analysts. The complexity of all this is further compounded by the need to collect and analyze data that may reside in a classic datacenter on premises as well as in private and public cloud systems, which demands that the configuration support a hybrid cloud environment. After describing these issues, we consider the usefulness of a purpose-built database system that can accelerate access to and management of relevant data and is designed to deliver high performance for these workloads.
Tags : 
    
Group M_IBM Q3'19
Published By: Group M_IBM Q3'19     Published Date: Sep 04, 2019
In the last few years we have seen a rapid evolution of data. The need to embrace the growing volume, velocity, and variety of data from new technologies such as Artificial Intelligence (AI) and the Internet of Things (IoT) has accelerated. The ability to explore, store, and manage your data, and therefore drive new levels of analytics and decision-making, can make the difference between being an industry leader and being left behind by the competition. The solution you choose must be able to:
• Harness exponential data growth as well as semi-structured and unstructured data
• Aggregate disparate data across your organization, whether on-premises or in the cloud
• Support the analytics needs of your data scientists, line-of-business owners, and developers
• Minimize difficulties in developing and deploying even the most advanced analytics workloads
• Provide the flexibility and elasticity of a cloud option but be housed in your data center for optimal security and compliance
Tags : 
    
Group M_IBM Q3'19
Published By: Infinidat EMEA     Published Date: May 14, 2019
Reinventing Data Centers at Petabyte Scale to Enable Competitive Advantage
Tags : 
    
Infinidat EMEA
Published By: Infinidat EMEA     Published Date: May 14, 2019
Data continues to grow at an astounding pace. As a result, data center space is becoming more scarce as more arrays are acquired to store all of this data, and that data also consumes a great deal of power and cooling. In fact, the average data center in the U.S. uses approximately 34,000 kW of electricity each year, costing $180,000 in annual energy costs. As Infinidat set out to revolutionize the storage industry, one of our goals was to help consumers of storage build a more sustainable infrastructure that is not only better for the environment but also saves them money. All of our patents come together to form InfiniBox, a storage solution that does just this.
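As a rough illustration of how a facility's power draw turns into the kind of annual energy bill cited above, the sketch below converts a continuous load into yearly consumption and cost; the load and electricity rate are assumptions chosen for illustration, not figures from the paper.

```python
# Back-of-envelope data center energy cost. The load and electricity rate
# below are illustrative assumptions, not figures from the Infinidat paper.
HOURS_PER_YEAR = 8760

def annual_energy_cost(avg_power_kw: float, rate_per_kwh: float):
    """Return (annual kWh consumed, annual cost) for a constant load."""
    annual_kwh = avg_power_kw * HOURS_PER_YEAR
    return annual_kwh, annual_kwh * rate_per_kwh

if __name__ == "__main__":
    kwh, cost = annual_energy_cost(avg_power_kw=500, rate_per_kwh=0.10)
    print(f"~{kwh:,.0f} kWh/year, roughly ${cost:,.0f}/year in energy costs")
```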
Tags : 
    
Infinidat EMEA
Published By: Flexential     Published Date: Jul 17, 2019
By 2020, there will be 50 billion devices producing 600 ZB of data, driving enterprises to assess their IT infrastructures to keep pace with unprecedented data processing demands. Is your business ready to handle this data explosion? In this on-demand webinar, you’ll hear Tim Parker, Vice President of Network Services at Flexential, and Craig Matsumoto, Senior Analyst of Data Center Services at 451 Research, discuss the evolution of data, and how enterprises are processing and creating it. Download now to hear Parker and Matsumoto examine how edge computing solutions are meeting the latency demands of an increasingly digitized world.
Tags : 
    
Flexential
Published By: Flexential     Published Date: Jul 17, 2019
The hybrid cloud has arrived for the enterprise, but it comes with a complication: the speed of light. Between the cloud and the end user (including IoT devices that count as ‘users’), there is an emerging need for an intermediate environment that can satisfy real-time compute requirements without incurring the latency of reaching all the way to the cloud. ‘Edge compute’ is the phrase used to describe what covers this middle ground, but the optimal location for edge compute resources remains open to question.
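To put the speed-of-light constraint in concrete terms, the sketch below estimates round-trip propagation delay over fiber at a few distances; the distances and the fiber velocity factor are illustrative assumptions, and real round trips add routing, queuing, and processing time on top of these figures.

```python
# Rough propagation-delay estimate: why distance between user and cloud matters.
# Assumes signals travel through fiber at roughly two-thirds the speed of light;
# ignores routing, queuing, and processing delays, which only add to the totals.
SPEED_OF_LIGHT_KM_PER_S = 299_792
FIBER_VELOCITY_FACTOR = 2 / 3

def round_trip_ms(distance_km: float) -> float:
    one_way_seconds = distance_km / (SPEED_OF_LIGHT_KM_PER_S * FIBER_VELOCITY_FACTOR)
    return 2 * one_way_seconds * 1000

if __name__ == "__main__":
    for km in (50, 500, 2000):  # nearby edge site vs. regional vs. distant cloud region
        print(f"{km:>5} km one-way distance -> ~{round_trip_ms(km):.1f} ms round trip")
```

Even with these optimistic assumptions, a distant cloud region adds tens of milliseconds per round trip, which is why real-time workloads push compute toward an intermediate edge tier.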
Tags : 
    
Flexential
Published By: Flexential     Published Date: Jul 17, 2019
In a data environment that’s become increasingly centralized by public cloud services, the “edge” is emerging as a critical solution for reducing latency for network-based services. Consumption habits of services and the need for analytics are shifting beyond core population centers, becoming local and even hyper-local within a region or city. As the online population continues to grow and new services emerge, the ability to handle data traffic securely – close to the customer or application – will become a common pattern for the new service evolution.
Tags : 
    
Flexential
Published By: Dell EMC     Published Date: Aug 01, 2019
In the Principled Technologies datacenter, we tested the All-Flash Dell EMC SC5020 storage array and the HPE Nimble Storage AF5000 array to see how well they performed while handling two workloads at once. The Dell EMC array handled transactional database workloads and data mart imports better than the HPE solution without sacrificing performance. Download this whitepaper from Dell and Intel® to learn more.
Tags : 
    
Dell EMC
Published By: Dell EMC     Published Date: Aug 01, 2019
Greater performance, agility, and security are the new imperatives of the modern data center, and according to the Edison Group, Dell EMC PowerEdge servers outperform HPE servers. PowerEdge servers, with Intel® Xeon® Scalable processors, have customer-centric innovation that is better able to meet the advanced demands of IT Transformation, today and tomorrow. Download this infographic to learn the 5 reasons why.
Tags : 
    
Dell EMC
Published By: Dell EMC     Published Date: Aug 01, 2019
Pursuing agility to truly impact business transformation requires embracing data center modernization as a core competency. Crucial to this is having the most up-to-date IT infrastructure to support the scale and complexity of a changing technology landscape. Companies must embrace this imperative by adopting software-defined data center principles, embracing modernization, and automating their IT management processes. Those that do will propel business innovation and deliver superior customer experiences with fast, secure, and reliable business technology. Download this whitepaper from Dell and Intel® to learn more.
Tags : 
    
Dell EMC
Published By: Cisco and NVIDIA Corporation     Published Date: Sep 09, 2019
Application performance and delivery have changed. Should your network change too? Cloud is changing the fundamentals of how IT teams deliver applications and manage their performance. Applications are increasingly deployed farther from users, crossing networks outside of IT’s direct control. Instead of enterprise data centers, many apps now reside in public and hybrid cloud environments. There are even new breeds of applications built upon microservices and containers. Today, IT needs modern solutions that:
• Extend on-premises networks, apps, and infrastructure resources to the cloud.
• Maintain high levels of performance, user experience, and security across all applications, including microservices-based apps.
• Sustain operational consistency across on-premises and cloud environments.
• Move away from the expense, complexity, and poor performance of traditional networking methods.
These solutions are available for apps running on Google Cloud Platform (GCP).
Tags : 
    
Cisco and NVIDIA Corporation
Published By: Infosys     Published Date: Sep 12, 2019
Digital-born companies have challenged large, long-established businesses across industries with newer data- and AI-powered experiences, products, and services. Sustained competitive advantage built on customer ownership and seller power has since been significantly challenged and overturned. Customers are adopting newer AI- and data-powered products and services in their pursuit of better experiences and exponentially higher value. This has pushed every company to challenge the status quo, unleash itself from the very structure of its industry, and embrace transformation in the new world. Data and AI have become a major economic force at the epicenter of the transformation of every industry, across three horizons. In the first horizon, data was the key ingredient in driving more data-driven decisions. In the second horizon, data is playing a transformational role in the enterprise’s pursuit of becoming a Data Native Digital Native enterprise.
Tags : 
    
Infosys
Published By: Apstra     Published Date: Aug 19, 2019
Traditionally, network infrastructure teams have chosen the hardware and switch operating system (OS) first, and only then designed and built their infrastructure around that choice.
Tags : 
    
Apstra
Published By: Apstra     Published Date: Aug 19, 2019
Yahoo Japan Corporation, one of Japan’s largest Internet service providers, is deploying Clos fabric networks to efficiently handle its ever-growing data center traffic. Apstra® was selected to streamline the design, build, and operation of these networks.
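To illustrate what a Clos (leaf-spine) fabric implies for switch and port counts, the sketch below sizes a simple two-tier fabric; the port counts are assumptions chosen for the example and have nothing to do with Yahoo Japan’s actual design or Apstra’s software.

```python
# Illustrative two-tier Clos (leaf-spine) sizing arithmetic. Parameters are
# example assumptions, not Yahoo Japan's or Apstra's figures.
# Assumes every port runs at the same speed.
def size_leaf_spine(leaf_ports: int, uplinks_per_leaf: int, spine_ports: int):
    """Return (spines, max leaves, server-facing ports, oversubscription ratio)."""
    downlinks_per_leaf = leaf_ports - uplinks_per_leaf
    spines = uplinks_per_leaf          # each leaf connects once to every spine
    max_leaves = spine_ports           # each leaf consumes one port on every spine
    server_ports = max_leaves * downlinks_per_leaf
    oversubscription = downlinks_per_leaf / uplinks_per_leaf
    return spines, max_leaves, server_ports, oversubscription

if __name__ == "__main__":
    spines, leaves, servers, ratio = size_leaf_spine(
        leaf_ports=48, uplinks_per_leaf=6, spine_ports=32
    )
    print(f"{spines} spines, up to {leaves} leaves, "
          f"{servers} server-facing ports, {ratio:.0f}:1 oversubscription")
```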
Tags : 
    
Apstra
Published By: ASG Software Solutions     Published Date: Nov 05, 2009
Data centers need, more than ever, effective workload automation that provides complete, management-level visibility into the real-time events impacting the delivery of IT services. The traditional job scheduling approach, with an uncoordinated set of tools that often requires reactive manual intervention to minimize service disruptions, is failing more than ever in today’s complex world of IT, with its multiple platforms, applications, and virtualized resources.
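The coordination the abstract calls for boils down to running jobs in dependency order rather than from isolated, per-tool schedules. Below is a minimal sketch of that idea using Python’s standard topological sorter; the job names are hypothetical and this is not ASG’s product logic.

```python
# Minimal dependency-aware job ordering, the core idea behind coordinated
# workload automation. Job names are hypothetical. Requires Python 3.9+.
from graphlib import TopologicalSorter

def run_in_dependency_order(jobs: dict) -> list:
    """jobs maps each job name to the set of jobs it depends on."""
    order = list(TopologicalSorter(jobs).static_order())
    for job in order:
        print(f"running {job}")  # placeholder for the real job launcher
    return order

if __name__ == "__main__":
    run_in_dependency_order({
        "extract_orders": set(),
        "extract_inventory": set(),
        "load_warehouse": {"extract_orders", "extract_inventory"},
        "nightly_report": {"load_warehouse"},
    })
```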
Tags : 
asg, cmdb, bsm, itil, metacmdb, workload automation, wla, visibility, configuration management, metadata, lob, sdm, service dependency mapping, ecommerce, bpm, workflow, itsm, critical application
    
ASG Software Solutions
Published By: Upsite Technologies     Published Date: Sep 18, 2013
The average computer room today has cooling capacity that is nearly four times the IT heat load. Using data from 45 sites reviewed by Upsite Technologies, this white paper will show how you can calculate, benchmark, interpret, and benefit from a simple and practical metric called the Cooling Capacity Factor (CCF). Calculating the CCF is the quickest and easiest way to determine cooling infrastructure utilization and potential gains to be realized by AFM improvements.
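The white paper walks through the calculation in detail; as a minimal sketch, the CCF is commonly computed as running cooling capacity divided by the IT load plus an allowance for non-IT heat sources. The 10% allowance and the example figures below are assumptions for illustration, not values from the paper.

```python
# Cooling Capacity Factor (CCF) sketch. The 10% allowance for non-IT heat
# (lights, people, envelope losses) and the sample figures are assumptions;
# see the Upsite white paper for the exact definition and interpretation.
def cooling_capacity_factor(running_cooling_kw: float, it_load_kw: float,
                            non_it_allowance: float = 0.10) -> float:
    return running_cooling_kw / (it_load_kw * (1 + non_it_allowance))

if __name__ == "__main__":
    # A room with 400 kW of running cooling capacity serving a 100 kW IT load:
    ccf = cooling_capacity_factor(running_cooling_kw=400, it_load_kw=100)
    print(f"CCF = {ccf:.1f}")  # ~3.6: far more cooling capacity than the load needs
```

A CCF near 3.6, as in this example, matches the "nearly four times the IT heat load" figure cited above and signals substantial stranded cooling capacity.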
Tags : 
ccf, upsite technologies, cooling capacity factor, energy costs, cooling, metrics, practical, benchmark
    
Upsite Technologies
Published By: Alcatel-Lucent     Published Date: Dec 02, 2009
Faster, more powerful, and denser computing hardware generates significant heat and imposes considerable data center cooling requirements.
Tags : 
data center design and management, distributed computing, internetworking hardware, power and cooling
    
Alcatel-Lucent