The Data Stack

11 Posts authored by: Rob Shiveley

For the first time, Red Hat Summit, the premier Open Source technology showcase, will take place on the West Coast, at San Francisco’s Moscone Center from April 14-17. It’s exciting to gather with the Open Source community of system administrators and enterprise architects to hear about the latest achievements in the Open Source tech universe.

 

Intel and Red Hat are key pioneers in bringing Open Source Linux to the enterprise, providing an established foundation for mission-critical workloads and cloud deployments based on the Red Hat Enterprise Linux* operating system and Intel® Architecture.

 

Doug Fisher, VP and general manager of Intel’s Software and Services Group, will deliver a keynote address on April 15 from 9:30 to 10 a.m. laying out Intel’s vision for a Software Defined Infrastructure, which encompasses the compute, network, and storage domains and leads the way toward more agile and cost-effective data center architectures. Doug’s keynote will also focus on how Red Hat and Intel are addressing the architecture challenges posed by the explosive growth in data and connected devices.

 

Intel is also sponsoring a number of sessions and will participate in several panel discussions. Here are a few highlights – you won’t want to miss them:

 

  • Tuesday, April 15, 1:20-2:20pm: Empowering Enterprise IT. What’s next in data center efficiency and agility? Join Jonathan Donaldson, general manager of Intel’s Software Defined Infrastructure Group, for a discussion of how Intel and Red Hat are uniquely poised to take advantage of the changing face of IT infrastructure.

It will be a great show, to be sure. Please share your thoughts about Red Hat Summit and Intel!

 

For the latest on data center optimization, follow me at @RobShiveley.

Seismic shifts can happen in nature – and every once in a while, they happen in technology.  Examples include personal computing, mobile computing, the Internet, and more recently cloud computing and Big Data.

 

In-memory computing, one of the latest significant technology advancements, is the alignment of new, complementary technologies that can revolutionize how businesses use and manage data and redefine competitive landscapes in many industries. The move to in-memory computing is not an alternative to traditional solutions; it will become the way computing is done. When and how companies move to in-memory computing is becoming a critical business decision.

 

In-memory computing represents the synthesis of multiple technology innovations that work together to deliver real-time, data-driven business intelligence to the enterprise. These include:

 

Keeping memory local.


Analyzing vast data sets for business insights is beyond the bounds of traditional database and analytics solutions that rely on mechanical data storage and access. However, the availability of affordable memory modules has hastened the development of in-memory computing models in which all relevant data is maintained in the main memory of the computing system, rather than in remote bulk storage systems.  In-memory systems can render data for analysis as soon as it is generated, with data-driven business insights available almost instantly.
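
To make the contrast concrete, here is a minimal sketch in Python showing the in-memory pattern: load the working set into RAM once, then run repeated, interactive queries against it with no disk I/O on the hot path. The file name and column names are illustrative assumptions, not part of any Intel or SAP product.

    import pandas as pd

    # Illustrative only: read the full working set into RAM once.
    # An in-memory system keeps data resident instead of re-reading
    # disk for every query.
    sales = pd.read_csv("sales_2013.csv", parse_dates=["order_date"])  # hypothetical file

    # Subsequent ad hoc queries run at memory speed.
    revenue_by_region = (
        sales.groupby("region")["revenue"].sum().sort_values(ascending=False)
    )
    recent = sales[sales["order_date"] >= "2013-12-01"]

    print(revenue_by_region.head())
    print(len(recent), "orders since December 1")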

 

Moving beyond relational databases.


Several leading database providers have re-architected their legacy database technologies to better meet the challenges of big data. New database solutions combine the features of traditional row store processing with columnar data processing and in-memory columnar compression, a set of technologies that vastly speeds the scanning of immense data sets and analytical querying. It all adds up to speed-of-thought analytics and more cost-effective memory and storage systems.
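
For a rough intuition of why columnar layouts speed up scans, here is a small Python sketch using NumPy; the table size, column names, and dictionary encoding shown are invented for illustration and are not tied to any particular vendor’s database. An analytical query that touches only one or two columns reads far less data when each column is stored contiguously, and low-cardinality columns compress well.

    import numpy as np

    N = 10_000_000  # illustrative row count

    # Column-oriented layout: each column is its own contiguous array.
    price = np.random.rand(N).astype(np.float32)
    quantity = np.random.randint(1, 10, size=N).astype(np.int32)

    # An aggregate over two columns scans only those arrays (~80 MB here)
    # instead of every field of every row.
    total_revenue = float(np.dot(price, quantity))

    # Dictionary encoding: a low-cardinality column stored as small integer
    # codes plus a lookup table, one common columnar compression trick.
    region = np.random.randint(0, 4, size=N).astype(np.int8)      # codes 0..3
    region_labels = np.array(["EMEA", "APAC", "AMER", "LATAM"])   # dictionary
    revenue_by_region = np.bincount(region, weights=price * quantity, minlength=4)

    print(total_revenue)
    print(dict(zip(region_labels, revenue_by_region.round(2))))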

 

Harnessing Big Data.


Apache Hadoop* has become the de facto standard for storing and analyzing huge, unstructured data sets. The Intel® Distribution of Apache Hadoop* software (Intel® Distribution) is a software platform that provides distributed processing and data management for enterprise applications that analyze massive amounts of diverse data. The Intel Distribution provides enterprise–class management tools and data security as part of a cost-effective solution for ingesting, preparing, and storing warm data for inclusion in in-memory computing and real-time analytics environments.
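
As a rough illustration of the kind of batch preparation Hadoop handles, here is a minimal Hadoop Streaming job written in Python that counts log events per day. The input field layout, file paths, and the assumption that the Intel Distribution runs standard Hadoop Streaming jobs unchanged are all illustrative, not taken from Intel documentation. The mapper emits one count per log line:

    #!/usr/bin/env python
    # mapper.py -- emit "date<TAB>1" for each log line (tab-separated input assumed)
    import sys

    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0]:
            date = fields[0][:10]      # assume an ISO timestamp in column 0
            print("%s\t1" % date)

The matching reducer sums the counts per key:

    #!/usr/bin/env python
    # reducer.py -- sum the counts for each date (input arrives sorted by key)
    import sys

    current, total = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print("%s\t%d" % (current, total))
            current, total = key, 0
        total += int(value)
    if current is not None:
        print("%s\t%d" % (current, total))

The job would be submitted with something like hadoop jar hadoop-streaming.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input /logs -output /daily_counts, with the jar path and HDFS directories depending on the cluster.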

 

Optimizing server platforms for in-memory computing.


The Intel® Xeon® processor E7 v2 family powers a new generation of servers that are specifically optimized for in-memory computing, delivering up to 2x higher performance than previous generation servers[1] and providing up to 3x higher memory scalability[2]. These new server platforms are ideal for the data-intensive, mission-critical demands of in-memory computing solutions.

 

Supporting these ground-breaking compute solutions at enterprise scale requires optimized servers with massive memory capacity, parallel execution resources, high system bandwidth, and advanced built-in reliability. To address the heavyweight demands of in-memory computing, the Intel® Xeon® processor E7 v2 product family unlocks the potential of big data and delivers large-scale in-memory analytics for data-driven decision making and improved competitive advantage. Learn more about the computing power of the Intel Xeon processor E7 v2 family.

 

Get the details about in-memory computing and on-demand business intelligence in the white paper “Changing the Way Businesses Compute…and Compete.”

 

 

[1] Source: Intel internal measurements, November 2013. Configurations: Baseline 1.0x: Intel® E7505 Chipset using four Intel® Xeon® processors E7-4870 (4P/10C/20T, 2.4GHz) with 256GB DDR3-1066 memory scoring 110,061 queries per hour. Source: Intel Technical Report #1347. New Generation 2x: Intel® C606J Chipset using four Intel® Xeon® processors E7-4890 v2 (4P/15C/30T, 2.8GHz) with 512GB DDR3-1333 (running 2:1 VMSE) memory scoring 218,406 queries per hour. Source: Intel Technical Report #1347.

 

[2] On a 4-socket natively connected platform: the Intel® Xeon® processor E7 family supports 64 DIMMs at a maximum of 32 GB per RDIMM (2 TB total); the Intel® Xeon® processor E7 v2 family supports 96 DIMMs at a maximum of 64 GB per RDIMM (6 TB total). This enables a 3x increase in memory.

SAP TechEd is SAP’s premier technical education conference, where IT architects, administrators and developers in the SAP community convene to enhance their skills, receive training, and learn the latest about new developments within the larger ecosystem of SAP and its technology partners. Intel has worked with SAP for over a decade, and we’re blazing the trail in a number of realms, including co-engineered big data infrastructures and faster, more powerful processors that support the next generation of in-memory data platforms.

 

To learn the latest about SAP and Intel collaborations, come hear Diane Bryant, Intel Vice President and Chief Information Officer (CIO), who will make a virtual appearance during the keynote address by SAP Executive Board member Dr. Vishal Sikka at 8:00 am on Tuesday, Sept 22. Diane’s overview of joint innovations between Intel and SAP will include the integration of the Intel® Distribution for Apache Hadoop* with SAP HANA*, which creates an end-to-end analytics framework for faster, more powerful big data insights. She will also describe how the new Intel® Xeon® processor E7 family works with SAP HANA to increase reliability and deliver up to 3X the memory capacity of the previous generation of processors.

 

Intel experts will also be on tap to lead tech sessions, where you can learn more about using Intel technologies within the SAP solutions environment. Check out the SAP TechEd Session Catalog for a full list of Intel sessions and events.

 

 

We are also presenting a number of demos at the Intel booth. Stop by from 10am to 6pm to witness these technologies in action:

 

  • Integration of Intel Distribution for Apache Hadoop and SAP HANA. See how the jointly engineered solution handles automated query federation and data collection from diversified environments.
  • Integration of Intel Distribution for Apache Hadoop and SAP HANA in the Cloud. Experience the Intel and SAP Big Data solution as a cloud-based service.
  • Virtualizing SAP HANA and Traditional SAP Application Landscapes. See how IBM technologies offer new capabilities for virtualizing SAP HANA and integrating management capabilities to consolidate and scale SAP application deployments. 
  • SAP HANA Enhanced Security. Discover how McAfee Security Technology together with SAP Security Policy can help ensure a more secure environment for user data in SAP HANA.
  • Fujitsu SAP HANA Appliance Using Intel SSD DC S3700 for Data Persistence. Come see the first SAP HANA-based appliance to use Intel Solid-State Drives with RAID 5.

 

I hope to see you at SAP TechEd, where you can find out more about the Intel Distribution for Apache Hadoop and  how the long history of co-innovation between Intel and SAP can help you deliver faster insights and better business value to your customers.

The Intel/Red Hat relationship spans more than 10 years and is a great example of how these two companies are helping transform the industry. Their engineering collaboration has positioned both companies as leaders in the open source software ecosystem.

 

Jim Totton, Vice President and General Manager of the Platform Business Unit at Red Hat Inc., talks about our collaboration over the years in this video, where he discusses how the combination of Intel servers and Red Hat Enterprise Linux software is enabling the next generation of agile data centers—physical, virtual, and cloud.

 

The Intel® Xeon® Processor E5-2600 v2 Product Family is the next important innovation Intel is bringing to market, with improved memory architecture, caching advancements, increased memory bandwidth, and lower power consumption. What makes the Intel Xeon processor E5-2600 v2 family so compelling is that the processor can automatically move into lower power states, depending on workload requirements, enabling an agile data center with optimized performance and improved energy efficiency.

 

Jim talks about those benefits, and illustrates how Red Hat’s enterprise operating system takes advantage of the hardware innovations found in Intel® Xeon® processors, which ultimately translates into value for organizations and their data center managers.

 

Check out Jim’s video! And visit http://intel.ly/11A5aF2 to learn about the latest technology advancements in Intel® architecture and Intel® technologies.

It’s no secret the world is becoming more and more mobile.  As a result, the pressure to support billions of devices and users is changing the very composition of datacenters, which is why Intel is introducing a portfolio of datacenter products and technologies for cloud service providers that handle a diverse set of lightweight workloads in the microserver, cold storage and entry networking segments.

 

Our goal at Intel is to provide key innovations original equipment manufacturers (OEMs), telecommunications equipment makers and cloud service providers can use to build the datacenters of the future.

 

Our leadership in silicon and system-on-chip (SoC) design, rack architecture, and software enabling is what allows us to create the new Intel® Atom™ Processor C2000 Product Family.

 

Intel® Atom™ Processor C2000 Product Family


The portfolio includes the second-generation, 64-bit Intel Atom C2000 product family of SoC designs. These new SoCs are Intel’s first products based on the Silvermont micro-architecture and include 13 customized configurations. 


 

New 64-bit, System-on-Chip Family for the Datacenter


We also introduced new silicon, the Intel® Ethernet Switch FM5224, which, when combined with the Wind River Open Network Software suite, brings Software Defined Networking (SDN) solutions to servers for improved density and lower power.

 

Switches based on the Intel Ethernet Switch FM5224 silicon can connect up to 64 microservers, providing 30 percent higher node density, 2.5 times the bandwidth, and half the latency.

 

First Live Demo of an RSA-Based System


In addition to the silicon and system announcements Intel made today, we are showing the first operational rack based on Intel® Rack Scale Architecture (RSA), featuring Intel® Silicon Photonics Technology with the MXC connector and ClearCurve* optical fiber developed in partnership with Corning.

 

The RSA-based rack enables more data density and speeds of up to 1.6 terabits per second at distances up to 300 meters.

 

For more information on the announcements, including Diane Bryant’s presentation, additional documents, and pictures, please check out Intel’s newsroom.

Intel and SAP have worked closely together for years, but our co-engineering has reached a new level of integration with the recent launch of our Big Data solution based on the SAP HANA® platform and Intel® Distribution for Apache™ Hadoop® software. The joint solution will be demonstrated at SAP Sapphire 2013, SAP’s flagship conference in Orlando, held May 14-16.

 

Stop by the Intel booth #3215 and experience our Big Data solution.

 

The Intel Distribution for Apache Hadoop integrated with SAP HANA represents more than a typical collaboration. It’s a great example of how co-engineering can result in a “more-than-the-sum-of-its-parts” success that changes the game by vastly improving the speed and resilience of Big Data analytics.

 

HANA is SAP’s blindingly fast database platform that consolidates transactional and analytical workloads into a single, in-memory process. Combining OLAP and OLTP structures into a unified landscape eliminates traditional relational limitations that have restricted the development of real-time business applications, and in particular Big Data analytics.

 

Apache Hadoop is the industry’s open source standard for managing Big Data, but ensuring that Hadoop’s data-intensive workflows can provide real-time analytics on an enterprise scale requires a comprehensive computing platform with muscle and intelligence.

 

The Intel Distribution for Apache Hadoop is specifically optimized for the advanced features of SAP HANA and the Intel® Xeon® processor E7 family. Intel’s version of Hadoop can leverage SAP HANA’s in-memory technologies to accelerate data analytics and can also tap into the extra performance and scalability the Intel Xeon processor E7 family provides. In turn, SAP HANA’s in-memory processes help eliminate the latencies found in Hadoop’s underlying file system to enable on-demand data analysis.
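
A client-side sketch of what combining the two tiers can look like appears below, written in Python with the hdbcli and PyHive client libraries. The hostnames, credentials, table names, and the decision to merge results in client code are assumptions for illustration only; the co-engineered solution described above federates such queries within the platform itself rather than in application code.

    from hdbcli import dbapi     # SAP HANA Python client
    from pyhive import hive      # Hive client for the Hadoop side

    # Hot, recent data lives in SAP HANA ...
    hana = dbapi.connect(address="hana.example.com", port=30015,
                         user="ANALYST", password="***")
    cur = hana.cursor()
    cur.execute("SELECT region, SUM(revenue) FROM SALES_2013 GROUP BY region")
    hot = dict(cur.fetchall())

    # ... while older, colder history sits in Hadoop/Hive.
    hadoop = hive.Connection(host="hadoop.example.com", port=10000,
                             username="analyst")
    hcur = hadoop.cursor()
    hcur.execute("SELECT region, SUM(revenue) FROM sales_archive GROUP BY region")
    warm = dict(hcur.fetchall())

    # Merge both tiers into a single, unified view.
    combined = {r: hot.get(r, 0) + warm.get(r, 0) for r in set(hot) | set(warm)}
    print(combined)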

 

Through joint-engineering, SAP and Intel have delivered a breakthrough Big Data solution that can store and analyze massive volumes of structured and unstructured data in real time. The underlying platform has the performance to scale to continued exponential data growth and deliver rapid-fire insights to help boost business productivity and profits.

 

For more information go to: http://hadoop.intel.com/resources

Intel has long blazed a trail of innovation for data center computing, leading the transition from mainframes to x86 towers to rack-mount blade servers, and beyond.

 

Today, Intel again takes the lead with the introduction of the Intel® Atom™ S1200 processor family, the industry’s first sub-10-watt server system-on-chip (SoC) that builds in enterprise-ready features such as 64-bit support, virtualization technologies, and error-correcting code (ECC) support for higher reliability. The industrial-strength Intel Atom S1200 microprocessor is designed to power high-density microservers as well as a new generation of storage and communications equipment.

 

So why does the world need a SoC microserver?

 

It turns out that one size of server does not fit all needs in the enterprise data center. As the server industry continues to segment, Intel recognized the need for high-density, hyper-scale servers based on low-power processors that can deliver extremely energy-efficient performance in a dense compute footprint. These characteristics are increasingly important for many data center workloads, and they address the immediate compute needs of companies that offer dedicated hosting and private clouds, run Big Data workloads and content delivery, or operate front-end servers for hosting web pages, yet still need to harness the horsepower of 64-bit computing.

 

Servers based on Intel Atom S1200 processors also give data center managers a new tool for workload management, allowing them to flex and scale hardware configurations to meet the needs of changing workloads. Because the new server is compatible with the most commonly used server operating systems, applications, and data center hardware, implementing it is straightforward, with no need to port or tune new software stacks. You can break up large and complex workloads, such as Hadoop jobs, into many small but highly parallel “chunks” of code and run them across a range of server nodes for optimal efficiency and performance, as sketched below. And there’s no need to rewrite code for the new high-density server, because all the code that’s already running in your x86 datacenter will also operate on Intel Atom S1200 processor-based platforms.
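
To illustrate the chunking idea in the simplest terms, here is a small Python sketch that splits a job into independent pieces and fans them out across worker processes; the file names and the per-chunk function are invented for the example. Hadoop applies the same divide-and-conquer pattern across many server nodes rather than local processes.

    from multiprocessing import Pool

    def count_errors(path):
        """Process one chunk independently: count ERROR lines in a log file."""
        errors = 0
        with open(path) as f:
            for line in f:
                if "ERROR" in line:
                    errors += 1
        return errors

    if __name__ == "__main__":
        # Hypothetical chunked input -- on Hadoop these would be HDFS blocks
        # spread across nodes; here they are simply local files.
        chunks = ["logs/part-%04d.log" % i for i in range(16)]
        with Pool(processes=4) as pool:
            per_chunk = pool.map(count_errors, chunks)
        print("total errors:", sum(per_chunk))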

 

With more than twenty low-power server, networking, and storage systems based on the Intel Atom S1200 now in production, the processors set a new milestone for optimizing performance with low-power, high-density computing. Download today’s press release announcing the Intel Atom S1200 and learn more about Intel’s roadmap for power-efficient, high-density computing.

Like it or not, consumerization is a fact of life. Your company’s employees are organizing their lives with smartphones, tablets and mobile PCs. And they expect those devices to help them at work too. If you don’t take steps to support them, you risk that they will take things into their own hands, including actions that put the security of your company’s data in jeopardy.

Luckily, the pieces are falling into place to help you take advantage of the employee productivity and satisfaction inherent in the Bring Your Own Device (BYOD) model.


Consumerization stretches IT along several fronts: integration, security, and user experience. Intel and Microsoft are collaborating on underlying technologies to help you optimize all three. In particular, they are improving access to the enterprise-driving SAP applications that really engage mobile users.


  • The companies are working together to implement hardware-based support for SAP middleware, including the SAP Afaria mobile management solution.
  • They are making it much easier to employ Intel-based platforms for popular SAP mobile apps, such as SAP Customer Financial Fact Sheet and SAP Interview Assistant. You’ll find these available by the end of the year.


These initiatives utilize hardware-enhanced security to make it easier to protect the BYOD enterprise. For example, your enterprise can already use Intel® Identity Protection Technology for fraud deterrence, Intel® Anti-Theft Technology to secure data and assets, and Intel® AES-NI encryption. As you see this technology rolling out in Intel-based smartphones, tablets and Ultrabooks™ – as well as traditional laptops – you’ll know how to enable each device as part of a more secure, integrated BYOD enterprise.
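
For a sense of what hardware-accelerated encryption looks like from the software side, here is a minimal Python sketch using the cryptography package, whose OpenSSL backend dispatches to Intel AES-NI instructions on processors that support them. The key handling is deliberately simplified for illustration and is not a production pattern.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Generate a 256-bit key; on AES-NI capable CPUs the underlying AES
    # primitives run on dedicated hardware instructions automatically.
    key = AESGCM.generate_key(bit_length=256)
    aesgcm = AESGCM(key)

    nonce = os.urandom(12)                   # 96-bit nonce, unique per message
    plaintext = b"quarterly forecast spreadsheet"
    associated_data = b"device-id:4711"      # authenticated but not encrypted

    ciphertext = aesgcm.encrypt(nonce, plaintext, associated_data)
    recovered = aesgcm.decrypt(nonce, ciphertext, associated_data)
    assert recovered == plaintext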

When your goal is delivering real-time analytics on a massive scale, it pays to work together – just ask SAP.

 

SAP and Intel have worked together for more than a decade, and the results of their joint engineering efforts have delivered game-changing levels of performance and scalability for enterprise application requirements.

 

SAP and Intel co-engineering has reached a new level of integration with SAP HANA*, an in-memory database optimized for big data analytics. SAP HANA delivers significant improvements – up to 7,976x better performance over previous systems across all project implementations – to help customers make faster, smarter decisions and improve productivity and profitability.

 

SAP HANA is optimized specifically for the advanced mission-critical features of the Intel® Xeon® processor E7 family, providing real-time analytics on an enterprise scale. Performance tests on a 16-node cluster of servers demonstrated sub-second response times for thousands of simultaneous ad hoc queries acting on 100 billion rows of sales and distribution data.

 

SAP HANA running on Intel Xeon processor E7 family-based servers doesn’t just accelerate analytics. It also simplifies the underlying database infrastructure and operational framework for greater efficiency, helping eliminate the complexity and delays of traditional data warehouse loading processes.

 

The combined SAP-Intel solution is available in pre-configured appliances from leading hardware solution vendors; they integrate quickly with SAP applications to simplify and accelerate data access and analytics.

 

But the news—and the collaboration—doesn’t end there.


At SAP TechEd 2012, held Oct. 16-19 in Las Vegas, SAP announced further advancements to SAP HANA, unveiling details for a new cloud-based application development platform powered by the SAP HANA in-memory database and announcing the availability of SAP HANA on Amazon Web Services, enabling the deployment of flexible cloud services in an SAP environment. SAP also cited SAP HANA momentum in the marketplace: a hundred start-ups have joined a development program for SAP HANA-related software, and over 600 enterprise clients are now using the SAP HANA platform.

 

And the SAP-Intel model for collaboration keeps evolving. Intel and SAP engineering teams have deployed SAP HANA in a new Petabyte Cloud Lab that provides 8,000 threads, 4,000 cores, and a whopping 100 TB of RAM in a server farm consisting of 100 four-socket Intel Xeon processor E7 family-based servers. In this deployment, the cluster capably handles a single instance of SAP HANA across a petabyte of data with near-linear scalability. The jointly funded lab provides Intel and SAP engineering teams with a large-scale research and development environment for further co-optimizing products and technologies.

 

Read our white paper Scaling Real-time Analytics across the Enterprise - and into the Cloud to learn more about how Intel and SAP are working together to speed access to information and insights to bring greater success and value to our customers.

Several server system vendors are rolling out new Itanium systems that incorporate the latest member of the Intel Itanium processor family: the Intel Itanium Processor 9300 Series. I'd like to congratulate HP on the recent launch of its completely revamped Integrity server system line-up. In late April, HP announced the most extensive overhaul to date of its Itanium-based Integrity product family. There's been a great deal of interest in HP's new products from industry press, analysts, and IT customers.

 

The new HP Integrity systems are built on HP's BladeSystem Infrastructure, with a new Blade Scale Architecture that simplifies the deployment and management of a number of blades in one enclosure. Even the highest performing HP Integrity server system, Superdome, is taking advantage of HP's new converged infrastructure, moving from a tower configuration to blades that plug into a blade chassis. Now customers will be able to take advantage of the legendary scalability and reliability of Itanium-powered HP Superdome 2 systems in a more flexible and manageable blade form factor. Now that's real innovation and value.

 

 

Go to the HP Integrity website for more information on HP's new Itanium Processor 9300-based Integrity server systems.

 

Go to the Intel Itanium Products website for more information on the Intel Itanium Processor 9300 Series.

 

Go to the Itanium Solutions Alliance website for the latest up-to-date Itanium solutions news.

HP is holding a global virtual event on April 27, 2010, that will introduce HP's vision and major announcements regarding its products and solutions for what HP describes as the "next era of mission-critical computing". The event will feature an overview session and some cool technology demos, and will include the presentation of thought-provoking white papers. Event participants will have the opportunity to share reactions to the announcement and ask questions through live expert chat sessions.

HP plans to share their perspective and some information about:

  • How to scale service-level agreements dynamically to meet business needs
  • Ways businesses and IT managers can capture untapped resources and react faster to new opportunities
  • Important methodologies and strategies to help reduce complexity in the IT environment
  • HP's recommendations about how to best lay the foundation for a converged infrastructure to accelerate business outcomes

As a partner to HP, I plan to join in; it should be a worthwhile event!

Registration is available at: www.hp.com/go/witness

You can also follow updates from the event on Twitter at #HPIntegrity
