dataset

Results 1 - 25 of 29

Infographic: Mobile Priorities in the Attention Economy

Published By: Quantum Metric     Published Date: Oct 18, 2019
Attaining your customers’ undivided attention is a key challenge for any online business, and the challenge is amplified when they explore your brand on a mobile device. Using Quantum Metric’s unique dataset, this infographic showcases the importance of keeping your mobile visitors engaged: among Fortune 500 companies, mobile visitors purchase at roughly one-quarter the rate of desktop visitors.
Tags : 
    
Quantum Metric

Delivering Information Faster: In-Memory Technology Reboots the Big Data Analytics World

Published By: SAP     Published Date: May 18, 2014
In-memory technology—in which entire datasets are pre-loaded into a computer’s random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.
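To make the idea concrete, here is a minimal sketch (plain Python and SQLite from the standard library, not SAP HANA or any particular in-memory engine) that runs the same aggregate query against a disk-backed database and against one pre-loaded entirely into RAM; the table, row count, and query are illustrative assumptions.

```python
# Toy illustration of in-memory vs. disk-backed query processing.
# This is NOT SAP HANA; it only shows the general idea that keeping the
# whole dataset in RAM avoids shuttling data to and from disk per query.
import os
import random
import sqlite3
import tempfile
import time

ROWS = [(i, random.random()) for i in range(200_000)]  # illustrative dataset

def populate(conn: sqlite3.Connection) -> None:
    conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", ROWS)
    conn.commit()

def time_queries(conn: sqlite3.Connection, label: str, repeats: int = 25) -> None:
    start = time.perf_counter()
    for _ in range(repeats):
        conn.execute("SELECT SUM(amount), COUNT(*) FROM sales").fetchone()
    print(f"{label}: {time.perf_counter() - start:.3f}s for {repeats} queries")

# Disk-backed database: queries may have to read pages from storage.
disk_path = os.path.join(tempfile.mkdtemp(), "sales.db")
with sqlite3.connect(disk_path) as disk_db:
    populate(disk_db)
    time_queries(disk_db, "disk-backed")

# In-memory database: the entire dataset is pre-loaded into RAM.
with sqlite3.connect(":memory:") as mem_db:
    populate(mem_db)
    time_queries(mem_db, "in-memory")
```

On a dataset this small the operating system's page cache narrows the gap, and dedicated in-memory platforms go far beyond this sketch, but the principle of avoiding per-query disk round trips is the same.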
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management
    
SAP

Amazon Redshift Spectrum: expert tips for maximizing the power of Spectrum

Published By: AWS     Published Date: Sep 05, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources.
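As a hedged illustration of the pattern the abstract describes (querying data that stays on S3 from a Redshift cluster), the sketch below follows the documented CREATE EXTERNAL SCHEMA / CREATE EXTERNAL TABLE flow for Redshift Spectrum from Python. The cluster endpoint, credentials, IAM role ARN, Glue database, bucket path, and column names are placeholders invented for this example, not values from the e-book.

```python
# Hedged sketch: define an external (Spectrum) table over Parquet files that
# stay on S3, then query it from Redshift. All identifiers below are
# placeholders for illustration, not values from the e-book.
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder endpoint
    port=5439, dbname="dev", user="awsuser", password="...",
)
conn.autocommit = True  # Redshift won't run external-table DDL inside a transaction block
cur = conn.cursor()

# External schema backed by the AWS Glue Data Catalog.
cur.execute("""
    CREATE EXTERNAL SCHEMA IF NOT EXISTS spectrum
    FROM DATA CATALOG DATABASE 'clickstream'
    IAM_ROLE 'arn:aws:iam::123456789012:role/ExampleSpectrumRole'
    CREATE EXTERNAL DATABASE IF NOT EXISTS
""")

# External table: the Parquet files are never loaded into the cluster.
cur.execute("""
    CREATE EXTERNAL TABLE IF NOT EXISTS spectrum.events (
        event_id BIGINT,
        event_time TIMESTAMP,
        amount DOUBLE PRECISION
    )
    STORED AS PARQUET
    LOCATION 's3://example-bucket/events/'
""")

# Spectrum scans S3 with its own compute fleet; only the aggregate comes back
# to the cluster, so the dataset can keep growing on S3.
cur.execute("""
    SELECT DATE_TRUNC('day', event_time) AS day, SUM(amount) AS revenue
    FROM spectrum.events
    GROUP BY 1
    ORDER BY 1
""")
for day, revenue in cur.fetchall():
    print(day, revenue)
```

Keeping the S3 data partitioned and in a columnar format such as Parquet reduces the amount of data Spectrum has to scan per query, which also helps keep query cost down.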
Tags : 
    
AWS

Ten Things You Need to Know About Data Virtualization

Published By: TIBCO Software     Published Date: Jan 17, 2019
Are you considering data virtualization for your organization today? In this paper you will learn 10 core truths about data virtualization and gain essential knowledge for overcoming analytic data bottlenecks and driving better outcomes.
Tags : 
virtualization, data, analytics, datasets, software, access, integration, projects
    
TIBCO Software

Pro tips for backing up large datasets

Published By: Carbonite     Published Date: Apr 09, 2018
IT admins tasked with restoring servers or lost data during a disruption are consumed with a single-minded purpose: successful recovery. But it shouldn’t take an adverse event to underscore the importance of recovery as part of an overall backup strategy. This is especially true with large datasets. Before you consider how you’re going to back up large datasets, first consider how you may need to recover the data. Variables abound. Is it critical or non-critical data? A simple file deletion or a system-wide outage? A physical server running onsite or a virtual one hosted offsite? These and a handful of other criteria will determine your backup and disaster recovery (BDR) deployment. What do we mean by large? A simple question with a not-so-simple answer. If your total data footprint is 5 TB or more, that’s considered large. But what kind of data is it? How many actual files are there? How frequently do they change? How much can they be compressed? It’s likely that two different 5 TB environments will have very different backup and recovery requirements.
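To show how criteria like these might shape a deployment decision, here is a toy Python sketch. The categories, thresholds, and recommendations are assumptions made for illustration only, not Carbonite's methodology.

```python
# Illustrative only: maps a few of the recovery variables discussed above
# to a suggested backup/disaster-recovery (BDR) approach. The labels and
# thresholds are assumptions for the sketch, not Carbonite guidance.
from dataclasses import dataclass

@dataclass
class Workload:
    critical: bool        # is downtime or data loss unacceptable?
    scope: str            # "file" (e.g., deleted file) or "system" (full outage)
    virtual: bool         # virtual machine vs. physical server
    footprint_tb: float   # total data footprint in terabytes

def suggest_bdr(w: Workload) -> str:
    if w.scope == "file" and not w.critical:
        return "file-level backup with point-in-time restore"
    if w.critical and w.scope == "system":
        base = "image-level backup with offsite replication and failover"
        return base + " (VM replica)" if w.virtual else base + " (bare-metal restore)"
    if w.footprint_tb >= 5:
        return "seeded initial backup plus incremental-forever offsite copies"
    return "standard scheduled backup with an offsite copy"

print(suggest_bdr(Workload(critical=True, scope="system", virtual=True, footprint_tb=8)))
```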
Tags : 
    
Carbonite

How Digital 2.0 Is Driving Banking’s Next Wave of Change

Published By: Cognizant     Published Date: Oct 23, 2018
In the last few years, a wave of digital technologies changed the banking landscape: social and mobile altered the way banks engage with customers, analytics enabled hyper-personalized offerings by making sense of large datasets, and cloud technologies shifted the computing paradigm from CapEx to OpEx, enabling delivery of business processes as services from third-party platforms. Now, a second wave of disruption is set to drive even more profound changes, including robotic process automation (RPA), AI, IoT instrumentation, blockchain distributed ledgers and shared infrastructure, and open banking platforms controlled by application programming interfaces (APIs). As these technologies become commercialized, and as demand increases for digitally enabled services, we will see unprecedented disruption as non-traditional banks and fintechs rush into all segments of the banking space. This whitepaper examines key considerations for banks as they explore value in the emerging Digital 2.0 world.
Tags : 
cognizant, banking, digital
    
Cognizant

Wasabi Ushers in Cloud Storage 2.0, the next generation of storage needs.

Published By: Wasabi     Published Date: Oct 23, 2017
An explosion of data storage needs, both in terms of volume and accessibility, is going unmet by first-generation storage solutions. The massive datasets now being generated are too costly to store and too slow to access to be fully leveraged. The needs of individual businesses, and of our greater economy, demand the commoditization of cloud storage. Cloud Storage 2.0 represents a new generation of solutions that promise to turn cloud storage into a utility along the lines of bandwidth and electricity. Leading this evolution with high-speed, low-cost, reliable cloud storage is Wasabi. In this white paper we look at the genesis and possibilities of Cloud Storage 2.0, and Wasabi’s place at its forefront. A free trial, with no credit card required, is also available.
Tags : 
wasabi, cloud storage, data storage, storage solutions
    
Wasabi

Pro Tips for Backing Up Large Data Sets

Published By: Carbonite     Published Date: Oct 10, 2018
IT admins tasked with restoring servers or lost data during a disruption are consumed with a single-minded purpose: successful recovery. But it shouldn’t take an adverse event to underscore the importance of recovery as part of an overall backup strategy. This is especially true with large datasets. Before you consider how you’re going to back up large datasets, first consider how you may need to recover the data.
Tags : 
    
Carbonite

Assuring Loan Quality Through the Loan Completion Process

Published By: Fiserv     Published Date: Nov 07, 2017
"In today’s ever-evolving lending landscape where loan quality and risk management challenge profitability and the customer experience, technology may be the key to thriving – both now and in the future. Winning financial services institutions will be the ones that transform their business models to place loan quality and risk management at the center of their operations. To facilitate continuous life-of-loan management, inclusive of the requisite data transparency and audit trails that support loan quality and loss mitigation, these institutions will implement and automate a loan completion process. Such a process will manage data quality and access to loan data and documents throughout origination, servicing and sale on the secondary market."
Tags : 
mortgage data quality, loan quality, loan data quality, mortgage quality, loan compliance, lending compliance, mortgage compliance, trid
    
Fiserv

How Automation Can Simplify Mortgage Loan Boarding

Published By: Fiserv     Published Date: Nov 07, 2017
Learn how loan onboarding can become more efficient and accurate by eliminating manual data validation with automation technology that is poised to transform mortgage servicing. From end-to-end, tools can simplify workflow processes, driving time and cost efficiencies. Trained staff can be deployed to greater effect and can be crucial to eliminating servicing errors. In the process, servicers improve data quality, save time and money, and deliver a better borrower experience.
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, lending efficiency
    
Fiserv

A Return to Fundamentals: Five Ways Mortgage Bankers Can Lower Costs

Published By: Fiserv     Published Date: Nov 07, 2017
"Recently, a number of factors have come together to decimate the profitability of the mortgage banking industry. To regain its footing, the industry must return to mortgage banking fundamentals. This paper carefully examines each function within the mortgage business to determine if there is a better approach that will save money and improve long-term profitability."
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, trid
    
Fiserv

From Origination to Delivery: Improving Loan Data Quality and Compliance

Published By: Fiserv     Published Date: Nov 07, 2017
"Improve Loan Data Quality and Compliance from Origination to Delivery. This complimentary CEB Gartner paper helps identify process and technology issues that lead to loan defects. Learn strategies for fixing issues and recommends technologies to help lenders improve loan data quality and compliance to reduce costs and improve the borrower experience. "
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, trid
    
Fiserv

Improving Data Quality & Making Compliance Checks More Efficient and Effective with Automation

Published By: Fiserv     Published Date: Nov 09, 2017
Digital loan origination processes can still require significant manual support, which is often inaccurate and time-consuming. This National Mortgage News paper, sponsored by Fiserv, explains how you can improve your current loan production while reducing costs and risk of non-compliance.
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, trid
    
Fiserv

Gartner: Maximize the Business Value of your Data Lake with a Smart Data Catalog

Published By: Waterline Data & Research Partners     Published Date: Nov 07, 2016
For many years, traditional businesses have had a systematic set of processes and practices for deploying, operating and disposing of tangible assets and some forms of intangible asset. Based on significant growth in inquiry discussions with clients, and on increased attention from industry regulators, Gartner now sees recognition of information as an asset becoming increasingly pervasive. At the same time, CDOs and other data and analytics leaders must take into account both internally generated datasets and exogenous sources, such as data from partners, open data, and content from data brokers and analytics marketplaces, as they come to terms with the ever-increasing quantity and complexity of information assets. This task is clearly impossible if the organization lacks a clear view of what data is available, how to access it, its fitness for purpose in the contexts in which it is needed, and who is responsible for it.
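The "clear view" described above (what data exists, how to access it, whether it is fit for purpose, and who is responsible for it) is essentially what a data catalog records. The sketch below models such a catalog entry in Python; the field names and example values are illustrative assumptions, not Waterline Data's schema.

```python
# Illustrative sketch of the kind of metadata a data catalog entry tracks:
# what the dataset is, where it lives, its fitness for purpose, and its owner.
# Field names and example values are assumptions, not Waterline Data's schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CatalogEntry:
    name: str                     # business-friendly dataset name
    location: str                 # where/how to access it
    fmt: str                      # physical format
    owner: str                    # who is responsible for the data
    fitness_tags: List[str] = field(default_factory=list)  # fitness-for-purpose notes
    source: str = "internal"      # internal system, partner feed, open data, broker

catalog = [
    CatalogEntry(
        name="Customer transactions",
        location="s3://example-lake/transactions/",
        fmt="parquet",
        owner="finance-data@example.com",
        fitness_tags=["PII-masked", "daily refresh", "validated"],
    ),
    CatalogEntry(
        name="Open census demographics",
        location="https://example.org/open-data/census.csv",
        fmt="csv",
        owner="analytics-team@example.com",
        fitness_tags=["external", "annual refresh"],
        source="open data",
    ),
]

# Answering "what data is available and who is responsible for it":
for entry in catalog:
    print(f"{entry.name} ({entry.source}) -> {entry.owner}: {', '.join(entry.fitness_tags)}")
```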
Tags : 
    
Waterline Data & Research Partners

10 Reasons to Deploy Intel® Optane™ Technology in the Data Center

Published By: Intel     Published Date: Sep 27, 2019
As the first major memory and storage breakthrough in 25 years, Intel Optane technology combines industry-leading low latency, high endurance, QoS, and high throughput, enabling solutions that remove data bottlenecks and unleash CPU utilization. With Intel Optane technology, data centers can deploy bigger and more affordable datasets to gain new insights from large memory pools. Here are just ten ways Intel Optane technology can make a difference to your business. To find out more, download this whitepaper today.
Tags : 
    
Intel

Six Steps to Faster Data Blending for Tableau

Published By: Alteryx, Inc.     Published Date: Sep 07, 2017
To learn how to get your Tableau datasets ready for analysis faster, download the how-to guide “6 Steps to Faster Data Blending for Tableau.”
Tags : 
    
Alteryx, Inc.

An ERP Guide to Driving Efficiency

Published By: Sage     Published Date: Jul 08, 2015
This white paper describes how ERP technology can improve efficiency by:
• Standardizing and automating business processes—locally as well as across multiple locations and countries—to accelerate business operations.
• Offering a fully integrated suite of business management applications that share a common dataset and extending these applications over the Internet, allowing visibility and collaboration across departments, as well as with customers, partners, suppliers, and remote users.
• Providing flexible and customizable reporting to improve business reporting, analysis, and insight.
Tags : 
enterprise resource planning, erp, efficiency, operating costs, standardization, automation, business management
    
Sage

Expert Tips For Maximizing The Power of Amazon Redshift Spectrum

Published By: AWS     Published Date: Nov 14, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources. This e-book aims to provide you with expert tips on how to use Amazon Redshift Spectrum to increase performance and potentially reduce the cost of your queries.
Tags : 
    
AWS

Amazon Redshift Spectrum: expert tips for maximizing the power of Spectrum

Published By: Amazon Web Services     Published Date: Sep 05, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources.
Tags : 
    
Amazon Web Services

Vertica Analytic DBMS - Ovum Butler Technology Audit

Published By: Vertica     Published Date: Feb 23, 2010
Ovum conducts a deep-dive technology audit of Vertica's Analytic Database, which is designed specifically for storing and querying large datasets.
Tags : 
ovum, vertica, analytical databases, dbms, technology audit, mpp, rdbms, grid computing
    
Vertica

Make your projects more awesome with Z.

Published By: HP     Published Date: Jan 16, 2015
Register below to gain exclusive access to the HP-NVIDIA® Autodesk Building Design Suite 2015 Graphics Optimization guide to help you get the most out of your workstation. Upon submission of your personal information, an HP representative will be in contact regarding your interests and needs.
Tags : 
visualization, bim, business information, workstations, building design, viewport, autodesk, cloud datasets
    
HP

Make your projects more awesome with Z.

Published By: HP     Published Date: Feb 11, 2015
Register below to gain exclusive access to the BIM tutorial video, where Autodesk’s Lynn Allen gives pointers on maximizing your performance and productivity with key features in BIM applications – ultimately, helping you get the most out of your workstation. Upon submission of your personal information, an HP representative will be in contact regarding your interests and needs.
Tags : 
visualization, bim, business information, workstations, building design, viewport, autodesk, cloud datasets
    
HP

Toxic Employees in the Workplace: Hidden Costs and How to Spot Them

Published By: Cornerstone OnDemand     Published Date: May 15, 2015
Leveraging econometric analysis of a dataset of approximately 63,000 hired employees spanning approximately 250,000 observations, this report looks not only at the measurable costs of toxic behavior such as sexual harassment, theft, and fraud, but also at other, equally damaging and harder-to-measure costs. The report examines these indirect costs closely, looking particularly at the toll toxic employees take on co-workers, and concludes that they create an even larger financial burden on businesses than the direct impact of an employee’s misbehavior.
Tags : 
hidden costs, sexual harassment, toxic employees, employee behavior
    
Cornerstone OnDemand

Seeing the Voice of the Customer: Identifying Root Cause with Text Analysis Visualization

Published By: SAS     Published Date: Mar 14, 2014
Stop to think about how, and how often, your business interacts with customers. Most organizations believe that only a small fraction of the interaction data they generate is effectively put to use. Why is that? Check out this whitepaper to find out.
Tags : 
sas, voc, voice of customer, visual text analytics, best practices, customer voice, sound of sentiment, text data
    
SAS

Moving from Enterprise search to cognitive exploration

Published By: IBM     Published Date: Feb 26, 2016
With Watson Explorer, you can keep enterprise search as the foundation and transform search into Cognitive Exploration. Leveraging technological advances such as deep search and exploration, advanced content analytics, and cognitive capabilities, IBM Watson Explorer provides a unified view of the information you need, combining data from multiple internal silos and a variety of outside datasets including social media. Stop limiting your search to traditional data sources in the new, non-traditional data world.
Tags : 
watson explorer, ibm, deep search, content analytics, enterprise software, application integration, application performance management, business activity monitoring
    
IBM