IT Peer Network



Community Spotlight: Data Center

20 Questions on SSD: How consistent is the performance of your SSD?

In this blog, we’ll look at a question that’s critical to performance in RAID sets and planning in your data center, “How consistent is the performance of your SSD?”

Hadoop Tutorials: Ingesting XML in Hive using XPath

In the first of my series of Hadoop tutorials, I wanted to share an interesting case that arose when I was experiencing poor performance trying to run queries and computations on a set of XML data.

Microsecond Responsiveness with Exchange and Intel SSD

In this blog we show the potential of using direct-attached Intel SSDs with JetStress 2013 as the driver for key storage characteristics.

Share your perspective

How are you addressing your data center’s energy usage?


Special Edition CXOTalk Google Hangout

Intel IT Strategy: Darcy Ortiz, Aziz Safa, David Aires

The question of IT leadership is fraught with challenge. In a world where IT's relationship to the business is being called into question, the CIO has a special responsibility to drive a transformational agenda.


In this special session of CxOTalk, sponsored by Intel, we speak with three of the company's top IT leaders. This episode explores Intel's multi-year plan to provide transformational value to the company as a whole.


Intel's IT Leaders include:

  • Darcy Ortiz, General Manager, CIO Strategy Office, Information Technology, Intel
  • Aziz Safa, Vice President and General Manager, Enterprise Apps & App Strategy, Information Technology, Intel
  • David Aires, Vice President and General Manager, IT Operations, Information Technology, Intel

Recent Blog Posts


Happy Monday. As I was driving in to the plant this morning, I was thinking about my first laptop back in the '90s. I had just migrated from a 486 tower that my local IT team built for me to a new laptop with an external modem and a tiny screen. What I remember most, though, is that my choice was one laptop, period: no choice of brand, hardware configuration, or options. I remember being so excited to jump on a plane with my laptop and be mobile, yet I still wanted more speed, more capability, and less weight...


On this Migration Monday, I'd like to share the latest in our device comparison guide.



It's Monday morning, and it's a great time to think about moving off of XP to a newer OS. This move may even mean taking an exciting leap to a tablet or another mobile-ready device.


Josh H


Read more of Josh's Migration Monday blogs

A while ago I found a blog post titled “What is Hindering Cloud Computing Uptake?” written by Walter Bailey. I was interested to see what he had come up with to explain why uptake remains low. Bailey states that cloud computing uptake remains low because people are not enthusiastic about moving their business into the cloud. Additionally, he adds five other possible reasons:


  1. IT functionalities (control, security and privacy)
  2. Infrastructure problems (systems are old and do not function well in a cloud environment)
  3. Talent shortage
  4. Budget and cost implications
  5. Pooling of resources


His arguments reminded me of an experience I had a couple of years ago. I was visiting the CIO of a large organization and told him about the importance of cloud computing in allowing him to respond quickly to the needs of the business. He laughed, telling me it typically took the business two years to make up its mind, so there was no reason for him to respond fast. Remaining polite, I did not argue. I just asked myself when he was actually made aware of those needs, and whether the business people saw things the same way. I also asked myself how much “shadow IT” was being used. I bet there was a lot. I call his approach resistance to change, and it will most probably push traditional IT out of the limelight. Obviously, I never heard back from that account.


The reasons identified above will play a role, but with the exception of the first, I don’t believe they are the main reasons. There is a key reason that is missing. Let me explain.


Earlier this week, I was talking to an account team. They were describing their customers’ IT objectives. On the operations front, one objective immediately caught my attention. They wanted to reduce the provisioning of a server from 30 to 60 days down to one day.


Organizational issues

So, I asked the question, “Why does it take between 30 and 60 days?” They diligently explained their process to me. First, the server needs to be installed in the datacenter. That is initiated by the server team, but performed by the facilities guys, so a couple of days are lost getting the approval and doing the work. Then the cables have to be connected so the server can be made operational. That's another couple of days. Then, before the server is recognized in the environment, it needs to be entered in the asset management system. That takes roughly a week. You then need an IP address attached to the server; this requires the intervention of the networking team, and those guys have a heck of a lot of work these days, so count two weeks before it's done. And then… by now you get the picture.


If we could calculate the actual amount of time spent working on the server, it would be well under one day, but by the nature of the organization and its processes, that one day turns into 30 to 60 days. 99% of the time, the server is just waiting. You know, that reminds me of my manufacturing days: Toyota and others developed Kanban to address exactly this. So, could they do the same? I'm not sure, as the issue here is the siloed structure of their organization.
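The arithmetic behind that figure is easy to check; a quick sketch using the one-day touch time and the 30-to-60-day elapsed windows from the example above:

```python
# Elapsed provisioning time vs. hands-on time, per the example above.
touch_time_days = 1  # generous: the actual work is "well under" a day
for elapsed_days in (30, 60):
    pct_waiting = 100 * (elapsed_days - touch_time_days) / elapsed_days
    print(f"{elapsed_days}-day provisioning: server idle {pct_waiting:.1f}% of the time")
```

With a full day of hands-on work the server sits idle roughly 97–98% of the elapsed time; with touch time closer to a few hours, the idle share does indeed exceed 99%.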


Traditional IT organizations are focused on specific technologies. You have the server guys, the storage people, the network team, the database administrators, etc. Automation can help, but is often resisted because there is a feeling many operators will lose their jobs if processes are automated.


About a year ago, I ran a cloud discussion with a customer IT department, and it completely failed. It's actually the only time this has happened to me, so it's not easily forgotten. I should have known from the moment the customer IT department introduced themselves that I was set up for failure: the person responsible for small Windows servers, the one for large Windows servers, the one for Linux servers, the one for NAS storage, the one for SAN storage, the person responsible for firewalls, the head of security, the data center network guy, the WAN guy, etc.


My presentation covered all aspects of cloud including security, compliance and risk management. As I always do, at the end of the presentation, I asked them if I had addressed their expectations. And a nearly unanimous answer came back: “You have only spent 5 minutes on my subject.”


Transform the IT organization

Cloud computing cuts across all these silos and requires collaboration between the different subject matter experts. Organizations have “antibodies” that reject ideas and people that change the way things are done. It's often called “resistance to change,” and it is deeply ingrained in many teams. It's often related to the fear of losing control—of having to transform.


Cloud Computing, as soon as you get beyond IaaS, is a game changer, so no wonder people are nervous. It forces IT professionals to transform their job. And that is often painful. Unfortunately, these days, change is the only constant and that applies for all of us, including IT professionals.


What makes it more complex, though, is the fact that IT needs to transition between two models, as IT departments will run their traditional and cloud environments in parallel for the foreseeable future. The teams focused on the legacy applications can continue operating the way they have been, while the teams delivering the new applications and services need to adapt to the new environment. So, IT organizations will have to combine two approaches, and that is complex and difficult.


If we agree that the hybrid cloud is the way to go—that IT becomes the strategic broker of services to the business—four key functional teams are required:


  1. A Cloud Platform team operating and managing the environment required to deliver to business users, in a transparent way, both internally developed services and services sourced externally. This platform includes an integrated portal, a service catalog, and the environment required to deliver and intermediate/aggregate services, as I described in a previous blog entry.
  2. A service development team, developing the “core” services that the company has decided to keep in house.
  3. A sourcing team, focused on the sourcing and intermediation/aggregation of all “context” services the enterprise obtains externally.
  4. A services governance team, reviewing with the business the services that need to be provided, managing the service lifecycle with them, and identifying the core and context services.


Build a unique user experience

Users want specific services, and if IT does not deliver them, the users will go around IT and find what they need. That's what is called “shadow IT.” The issue is that users do not always realize the implications of the choices they make. It's up to IT to deliver the services they require, by either developing or sourcing the appropriate services. The user is most often not interested in where the service comes from; he or she just wants to use it. The easier this is done, the better. So, IT should build a single portal through which the user can reach all the services required to do his or her job. The complexity of accessing the service should be shielded from the end user. This is precisely what we try to achieve with converged cloud: developing a unified user experience. Aggregation and intermediation of services from external sources should be transparent. The benefit for IT is that the sources can change without having to migrate users.



Yes, control, security, and compliance (e.g., privacy) are inhibitors to the use of public cloud. But if we take cloud in its widest sense (both private and public), the biggest inhibitor of all, for enterprises, is the way the IT department is organized. Many CIOs realize a change is needed, but they have to battle very strong resistance to change. The fact that the traditional and cloud environments will have to live together for the foreseeable future makes their job more difficult. The most advanced understand this and act accordingly; they will have a bright future as they become increasingly integrated with the business. Many are in wait-and-see mode, and some are simply in denial. Where are you on this journey? How are you addressing the integrated world of traditional and cloud environments?


Christian Verstraete is Chief Technologist for Cloud at HP and has over 30 years of industry experience, working with customers all over the world and linking business and technology.


Follow Christian's blog

Managing the Changing IT Landscape: IT Leadership


Today’s IT leaders can do anything, but not everything. Juggling rapidly evolving technologies, a range of new mobile devices, and a changing security landscape is no small endeavor. To drive maximum business value, IT has a responsibility to leverage the latest technology innovations, yet still maintain basic management processes to keep things running efficiently.


The modern CIO

Dion Hinchcliffe wrote a great article for ZDNet* earlier this year, explaining how the role of IT has shifted.


Today, the CIO’s challenge is to meet the needs of the entire business. When these needs aren’t met fast enough, IT services are procured and implemented by LOB leaders, departments, or individuals who need to enhance productivity or meet a customer need.


A new take on old ways

As I ponder the increasing complexity of the IT leader’s job, I see how challenging it is to keep the organization running smoothly while trying to adapt to the next request for systems and tools to enable growth. I recently read a ComputerWeekly blog by Tim Wasserman where he explores managing the tension between IT agility and discipline. Tim’s key point is that the two aren’t mutually exclusive. To be agile, an organization must still maintain discipline, but it requires a new—and balanced—approach.


“The need to integrate agility and discipline requires a whole new set of skills for business and project leaders, including an improvement in organizational execution capability as well as a reconsideration of organizational structure, culture, and change management,” Wasserman explains.

Six principles of “disciplined agility”

The key question is how? The blog cites six leadership principles that can help achieve a balance between discipline and agility. Here’s a quick summary:


  • Transparency and visibility
  • Adopt the customer’s perspective
  • Collaborate more
  • Demonstrate successes
  • Continue to learn
  • Embrace change


More change on the horizon

I’m a firm believer in that last principle: embracing change by planning for it. It’s inevitable that more change is coming to the enterprise, whether it takes the form of wearables at work or a new cloud-based technology, or something totally unexpected. By responding to these innovations with a balance of agility, strategy, and discipline, IT can realize greater success for the business.


Intel CIO Kim Stevenson talked recently about how Intel IT is becoming more agile by focusing on what is called the SMAC Stack (Social, Mobile, Analytics, and Cloud) to enable new customer experiences and create a scalable, cost-efficient way to deliver applications.


How is your organization managing the rapid changes of today’s enterprise environment? And how are you achieving a balance between agility and discipline?


Chris Peters


#ITCenter #ITLeadership

IT departments have likely seen an increase in employee requests for tablets or mobile devices as the consumerization of IT continues to shape and change the corporate landscape. Whether staff need network access under a BYOD scenario or are being issued company-owned devices, cost and performance are two important factors when determining what devices to purchase.


A low-cost tablet can be an effective way to provide the mobility features employees want without having to make a substantial investment in new equipment. In some cases, staff can be outfitted with a 2 in 1 device that will serve their needs for productivity as a traditional laptop while giving them the touchscreen display of a tablet (see our TCO report Save with a 2 in 1 Ultrabook). To help with this decision process, Intel commissioned Principled Technologies to review three Android tablets to determine which would be the most cost effective for staff.


-IT Peer Network Administrator


Principled Technologies reviewed the Dell Venue 7 (powered by an Intel Atom processor) along with the Samsung Galaxy Tab 3 7.0 and the Google Nexus 7 (powered by Qualcomm Snapdragon processors) to see which had the best combination of web browsing experience and cost.


To start, the Dell Venue 7 was the least expensive device at $149, followed by the Samsung Galaxy Tab 3 7.0 at $169 and the Google Nexus 7 at $229. Next, Principled Technologies tested the browsing experience on several popular websites and found that the Dell Venue 7 and Google Nexus 7 delivered the same relative experience. The Samsung Galaxy Tab, however, had several display issues, including choppy scrolling, blurry text, and slow load times.


Because manually testing websites can be subjective, Principled Technologies also ran their WebXPRT and MobileXPRT benchmarks, which mirror four HTML5- and JavaScript-based workloads: Photo Effects, Face Detect, Stocks Dashboard, and Offline Notes. As the results below show, the Google Nexus 7 had slightly better performance than the Dell Venue 7, but both outperformed the Samsung Galaxy Tab.


(Table: WebXPRT and MobileXPRT benchmark results for the three tablets.)

Based on these tests, Principled Technologies concluded that the Dell Venue 7 was the best device when price is considered. Though the Google Nexus 7 had equal storage and a better web browsing experience than the Dell Venue 7, at $80 less than the Nexus 7 the Dell Venue 7 was considered the better value.


Of course, the Dell Venue 7 isn't the only Intel-based tablet on the market. Intel is committed to improving security on Android tablets and is working with hardware partners on optimized 64-bit processors for the platform, so more robust devices are likely to roll out to market soon. Carefully choosing the mobile device that is best for your organization requires planning, and oftentimes a 2 in 1 device might be more appropriate for staff who travel regularly, but hopefully this quick guide will help IT decision makers save time and money when selecting a tablet.


For a more in-depth review of the three tablets, check out the Principled Technologies report, “A better Android tablet experience for your money.” Are you considering purchasing tablets for your enterprise? If so, what features are necessary for your staff? Leave a comment below or join the social conversation on Twitter using the hashtags #ITCenter and #tablets.

20 Questions on SSD: Things you’ll want to ask


Question #5: How can I use Temp, Tier, and Cache + Intel SSD?



So it’s been a few weeks since my last blog, and I want to switch gears away from the features of the Intel Data Center Family of SSDs. We’ve gone over features like production qualification, endurance requirements, power-loss protection, and, finally, consistent performance. What I want to talk about in this blog is solutions. Specifically, today we have a number of options for leveraging SSDs without going to the extreme of replacing your hard disk drives (HDDs) one-for-one with SSDs for any particular application.

The solutions team I work in at Intel has spent the last year exploring the benefits of our SSDs in a wide array of environments, from Big Data to virtualization with many stops in between. The interesting commonality in all of these explorations is that in many cases, the best ROI and TCO benefits come from using the SSD as temp, tier, or cache! There are a few use cases where one-for-one replacement of traditional HDDs pays off, but for the most part… repeat after me: temp, tier, and cache... think lions, and tigers, and bears, and a zoo!


In the ‘temp’ space (wow, that pun was even a stretch for me), we’ve seen goodness in Hadoop with jobs that produce intermediate data by pointing “mapred.local.dir” at an SSD, and in relational databases by moving ‘TempDB’ to a local SSD. In tiering with virtualization, we demonstrated a number of solutions where we built a low-cost, all-SSD, software-based SAN and moved VMs into these high-performance NFS or iSCSI datastores. Using SSDs as buffer/journal/cache, we’ve looked at software-based scale-out storage solutions such as VMware’s VSAN, PernixData, Microsoft Storage Spaces, and open source options such as Ceph. In the pure cache space, we’ve also looked at several caching software packages, including Intel CAS (Cache Acceleration Software), and how these packages can benefit enterprise IT workloads.


So, there’s plenty of opportunity to leverage SSDs to accelerate your enterprise workloads. The question is, “Will your workload benefit enough to yield an ROI or a TCO improvement?” Which brings me to my second point: your workload really matters! In my IT past, I often handed storage workloads off to the SAN (storage area network) team. When application performance issues arose, we looked at storage, determined whether it was the cause, and requested more IOPS or throughput from the SAN. In contrast, most of the temp, tier, or cache implementations studied by this team focus on locally attached storage, or DAS (direct-attached storage). This being the case, the application engineer and the systems engineer must work hand in hand to look at the workload and the details of the IO, and, in the case of our testing… determine how best to employ SSDs as temp, tier, cache, or, as a final option, complete conversion to SSDs (muahahaha)!


Let me illustrate with a brief walkthrough. Almost two years ago now, we published this paper on accelerating database workloads with Intel SSDs. In that paper we did a one-for-one replacement of local 15k disk drives with SSDs. If we were to repeat the exercise today, we’d need to investigate how much ‘TempDB’ was used, assess whether a solution like Intel CAS could help, evaluate features like ‘Buffer Pool Extension’ in Microsoft SQL Server 2014, and then finally look at wholesale replacement of the 15k data drives with SSDs based on both the IOPS and capacity requirements of the application.


The point here is that there are now many more opportunities to realize the benefits inherent in SSDs while keeping capital spend to a minimum. A brief query on one of my favorite retail sites this month (April '14) tells me that a 240 GB Intel SSD DC S3500 runs roughly the same price as one of the major OEMs' certified 10k SAS drives at 300 GB: about $250.00. So in $/GB there’s still a 1.25x difference in cost.
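For the curious, the $/GB gap from those April 2014 street prices works out like this (both prices as quoted above):

```python
ssd_price, ssd_gb = 250.00, 240   # Intel SSD DC S3500, 240 GB
hdd_price, hdd_gb = 250.00, 300   # OEM-certified 10k SAS drive, 300 GB
ssd_per_gb = ssd_price / ssd_gb   # cost per gigabyte, SSD
hdd_per_gb = hdd_price / hdd_gb   # cost per gigabyte, HDD
print(f"SSD ${ssd_per_gb:.2f}/GB vs. HDD ${hdd_per_gb:.2f}/GB "
      f"-> {ssd_per_gb / hdd_per_gb:.2f}x premium")
```

At the same street price, the smaller SSD capacity is exactly what produces the 1.25x (300/240) per-gigabyte premium.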


I’m looking forward to the day when SSDs are the default first choice in local storage, and that day is getting close. In fact, for uses like boot/swap the price is close enough to warrant an in-box RAID 1 SSD solution and an extra ace up the engineer’s sleeve. Until then, there are a lot of things I can do with temp, tier, and cache. The question for the reader is: can you use one of these methods to help your applications and data center run faster while providing a good ROI or a decrease in TCO? Could you do more with less and increase efficiency by leveraging SSDs?

- Chris


Christian Black is a Datacenter Solutions Architect covering the HPC and Big Data space within Intel’s Non-Volatile Memory Solutions Group. He comes from a 23-year career in enterprise IT.

Follow Chris on Twitter at @RekhunSSDs.


#ITCenter #SSD

As the workforce becomes more mobile, many companies are considering tablet devices for departments like Marketing and Sales in order to make completing tasks easier while on the go. Though tablets can be beneficial in several circumstances, IT should first look to new 2 in 1 machines, instead of purchasing and managing both a laptop and a tablet for each employee.


-IT Peer Network Administrator


To determine potential savings, Prowess Consulting analyzed three-year costs for an organization that plans to purchase two devices for each staff member: a sub-$1,000 laptop and either an Apple iPad Air or an Android (Samsung Galaxy Note 10.1) consumer tablet. In this scenario, both the laptop and the tablet would be managed, secured, and supported by IT, even in a BYOD arrangement. They then compared those three-year costs against purchasing and managing an Intel vPro 2 in 1, the HP EliteBook Revolve 810 G2, for each employee.


Though initial hardware costs might be assumed to be higher for IT departments purchasing two devices instead of one, the HP EliteBook Revolve 810 G2 cost about the same as the combined hardware cost of the laptop and either consumer tablet. Similarly, the price of software across the different scenarios was roughly equal.


The cost advantages of the HP EliteBook Revolve 810 G2 became obvious when they added in the costs of hardware support, replacements, and, most importantly, the costs to deploy, manage, secure, and support. (The savings would be even more dramatic if you factored in the costs of additional peripheral devices and accessories, and the time employees must spend managing files and content; however, these were omitted from the analysis.)


With both employer-purchased and BYOD devices, the organization and workers benefit with the Intel vPro 2 in 1 solution because such devices have the built-in remote manageability and security features of the 4th generation Intel Core vPro processor and enterprise-grade manageability features available with Windows 8.1 Pro.


For example, faster hardware-based data encryption can secure information without slowing down work, and two-factor authentication with Intel Identity Protection Technology (Intel IPT) can help prevent unauthorized access to devices and business data. Plus, IT can configure, diagnose, isolate, and repair an unresponsive infected PC using remote support and monitoring capabilities embedded into select Intel vPro processors.

The analysis shows that the associated costs of support, management, and security nearly doubled in the scenario where IT has to deploy two devices per employee instead of one. Simply put, the management of devices doesn’t scale.


Aside from lower IT management costs, users can enjoy a better work experience with the Intel vPro 2 in 1 because it can run full-featured Microsoft Office and other applications, which workers rely on, in tablet mode. iPad and Android tablets do not support these applications natively and offer less capable alternatives. Additionally, iPad and Android tablets do not match the browsing functionality or multi-tasking capabilities of Windows tablets, restricting Web-based productivity or the many tasks that require switching back and forth among multiple applications.


Therefore, instead of purchasing both a traditional laptop and a tablet device, a single HP EliteBook Revolve 810 G2 would be a less expensive and more secure alternative for the organization and provide a better laptop plus tablet experience for users.


To read the longer analysis comparing the two device scenarios, see the Total Cost of Ownership white paper. For more information on total cost of ownership, check out the latest video and infographic that highlight how much a business can save when it switches to a 2 in 1 device.


Are you weighing new device options? We want to hear from you! Leave a comment below telling us the different factors that influence your decision-making. And join the social conversation on Twitter using the hashtags #ITCenter and #2in1.

One of the challenging transitions teams need to make when they adopt ITIL Service Management is thinking in terms of Services rather than Products. Products are the means by which services are delivered; however, the Customer usually cares about the service, not the delivery system.


By definition, Service Management is about providing services to Customers who do not want to own the processes themselves.  Payroll is an example of a service that was once commonly provided within the finance department. As payroll legislation and processes became more complex, companies developed the capability to deliver payroll in their “computer department”.  Payroll became a service to finance. Once the service was defined, with requirements, costs and performance expectations, the Customer could then do comparison shopping, selecting the best service provider for the best price. Customers could contemplate whether the service would be “contracted” in house or outsourced to an external service provider like ADP. 


Defining services falls within the ITIL process groups Service Strategy and Service Design. Service Strategy articulates the services needed by the Customer and Service Design lays out the requirements for the service.  Defining services in the absence of the Service Strategy analysis is difficult for existing product-based organizations, but eventually leads to the same end state if all of ITIL Service Management is adopted.


Immediate benefits can be realized. Duplicate services are sometimes identified that had previously been disguised by product offerings.  Also, teams might be performing activities whose services are hard to define, and perhaps are not even needed. These discoveries  lead to great discussions between the Customer and the service providers when writing the contract known as the Service Level Agreement.


The Service Level Agreement is where Customer expectations are reconciled with the Customer’s budget. Costs are applied to Customer requirements, and the Customer then has the opportunity to choose the level of service that maximizes their profit. Instead of arbitrarily insisting on the best possible service, the Customer can think through the service level that is truly needed, based on the actual costs to the business. With expectations clearly articulated, the service provider knows the expected service level targets and can then demonstrate to the Customer in future meetings how well those expectations were met and, if not, what changes are required going forward.
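To make that trade-off concrete, here is a toy model of choosing a service level; the tier names, prices, and downtime figures are entirely hypothetical:

```python
# Hypothetical service tiers: (name, annual price, expected downtime hours/year)
tiers = [
    ("bronze", 10_000, 40),
    ("silver", 25_000, 8),
    ("gold",   60_000, 1),
]
downtime_cost_per_hour = 2_000  # assumed business cost of one hour of outage

def total_cost(tier):
    """Annual service price plus the expected business cost of downtime."""
    _name, price, downtime_hours = tier
    return price + downtime_hours * downtime_cost_per_hour

best = min(tiers, key=total_cost)
print(f"Cheapest overall: {best[0]} at ${total_cost(best):,}/year")
```

In this made-up example, "best possible service" (gold) is not the rational choice: silver costs $41,000 per year all-in, versus $62,000 for gold, because the business simply cannot lose $35,000 worth of productivity to seven saved downtime hours.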


One of the benefits of a service-based organization is the potential for innovation.   If the services can be distilled so that the true services are defined, the service owner can then contemplate alternative and possibly inventive solutions for delivering the service. The emergence of cloud computing is a great illustration of innovative solutions providing the same or better service but freed from the products from which the service was previously provided.


IT Service Management can result in innovation and a very pleased Customer – two outcomes not usually anticipated when companies embark on the service management journey.

This Malcolm Harkins blog originally appeared here. It is republished with permission.


End users, not technology, define the boundaries of the enterprise. Security strategies must protect this new perimeter.


You and I, engineers and software developers, every single employee in your company are part of the collective “we,” and as such each of us has a critical role to play in fortifying our security perimeter and preserving privacy. That perimeter surrounds our direct enterprise, our data, as well as the products, services, and solutions we deliver to our respective customers.


The world of computing is rapidly evolving. The traditional model where desktop computers were essentially fixed in place and security and privacy efforts were primarily focused on protecting the network perimeter is obsolete. So what replaces this model?


With corporate laptops and tablets, and employees bringing their own devices, the security perimeter has shifted, and it is now swayed by the choices of each employee. This shift brings tremendous benefits, but not without a significant potential for risk. In today’s fast moving environment the question becomes, have you, and every employee within your company, stayed abreast of what it takes to protect your security perimeter?


Are you prepared to secure your company’s perimeter as the Internet of Things (IoT) expands at an accelerating rate?


This chart from Daily Infographic*, prepared by*, shows the projected trajectory for the IoT, and underscores the need to ensure each and every one of us is prepared to embrace these advancements with proper security and privacy precautions.

(Chart: global Internet device forecast.)


To be clear, the Internet of Things should make our professional and personal lives easier; but it will not necessarily make them simpler – at least in the short term.


In conjunction with the incredible growth and opportunities ahead, we must identify early on the implications that emerging technologies will have on how data is collected, handled, stored, shared, managed, respected, and deleted.


When one careless choice can threaten an entire company’s security perimeter, what steps do each of us need to take?


The answer is as simple as it is daunting. Each of us needs to increase our knowledge of potential risks and apply it consistently as we make security choices.

It means fundamentally changing company culture. It is not enough to understand how technology works and connects; we must internalize the privacy and security implications of those tools and connections and create an environment that proactively and automatically takes the right actions.


It means educating others within our companies and communities about the opportunities and risks each person’s security choices entail, in a way that is easily understandable and compelling so people are both empowered and motivated to make the right choices.


It means watching out for each other so when one person’s actions open a potential security or privacy gap, another person is right there to help prevent the risk. It’s the “if you see something, say something” concept, applied to security and privacy.


It means recognizing that technical controls alone cannot protect us from rapidly changing attack structures or the complexity of new technologies. It’s time to step up to the expanded role and be part of the solution; we are all a vital part of the security and privacy perimeter.


* Other names and brands may be claimed as the property of others.

Managing the Changing IT Landscape: Big Data Analytics


There are multiple approaches to building data analytics models to solve a problem. The Intel-sponsored March Machine Learning Mania Kaggle competition has shown us that. The second stage of the contest inspired 440 entries—that’s 440 versions of a predictive analytics model—from 251 analytics teams. My congratulations to the winner, Grimp Whelken, a Kaggle novice, whose data model best predicted the winners of the NCAA men’s basketball tournament. Here are the final top 10 teams from the competition.

[Image: final leaderboard]


Lessons for Business from the Big Data Dance


Throughout this contest I’ve been interested in how elements of this predictive analytics competition can apply to business and IT. My previous two blogs in this series explored the shortage of data scientists and the importance of real-time prediction. As the competition closes and I write my final blog, I notice that the top 10 teams are almost all single players. But were they working alone?

In the competition, Kaggle players were supported by data derived from a sports ratings website; this information was pre-packaged and provided to the contestants by Kaggle.


In the real world, identifying the appropriate data, prepping it for use, and validating the model requires a close partnership between IT and the business. Business analysts with their domain expertise often provide a critical perspective on a big data problem and its relevant data. In the Kaggle contest Ken Massey, someone with recognized domain expertise, essentially served that role for each team.
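To make the modeling idea concrete: a common approach in a bracket-prediction contest like this is to turn a rating gap between two teams into a win probability with a logistic curve. The sketch below is purely illustrative—the function name, the ratings, and the `scale` parameter are my own assumptions, not details of any actual contest entry.

```python
import math

def win_probability(rating_a: float, rating_b: float, scale: float = 10.0) -> float:
    """Logistic win probability for team A, given two team ratings.

    `scale` controls how strongly a rating gap translates into certainty;
    it is a tuning parameter chosen for illustration, not a contest value.
    """
    return 1.0 / (1.0 + math.exp(-(rating_a - rating_b) / scale))

# Hypothetical ratings for two tournament teams (illustrative values only).
p = win_probability(rating_a=92.0, rating_b=85.0)
print(f"P(team A beats team B) = {p:.3f}")
```

Evenly matched teams come out at 0.5, and the probability rises smoothly as the rating gap grows—which is exactly the kind of calibrated output a contest scored on prediction quality rewards over a hard win/lose pick.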


[Image: final top 10 leaderboard]


Small Teams, Big Potential


Intel follows a big data project model that empowers small teams to pursue initiatives that can be accomplished in six months and promise $10 million in return on investment (ROI). Intel has successfully completed more than a dozen such projects and is now pursuing higher-value opportunities with $100 million ROI. These five-person teams (much like a basketball team) tap a variety of skilled positions, including IT, advanced analytics, and business experts.


Intel’s approach is validated by Tom Davenport, author of the recently published Big Data at Work. Davenport found that the large companies he interviewed for his research were forming teams of people with a range of skills rather than hiring PhD-level data scientists on a large scale. The teams included people with quantitative, computational, and business expertise, as well as skills in technology and change management.

Teams bring together the range of skills needed to tackle big data projects—business, analytic, and technology. How does your organization partner with the business to deliver projects of high ROI?


Personally, I was very excited to see the UConn men’s and women’s teams complete a historic sweep of both NCAA basketball tournaments. Go Huskies! I’m looking forward to watching the prediction engines in action again next year!


Chris Peters


Find Chris on LinkedIn.
Start a conversation with Chris on Twitter.
See previous content from Chris.

#ITCenter #BigData

Long-awaited 64-bit mobile computing is finally here! While 64-bit computing is nothing new, its effects on mobility will be significant. As Intel’s Julie Coppernoll described to PC World, the impacts 64-bit computing will have on mobile computing capabilities include “improvements in video performance, compression, decoding and other intensive computing tasks with a 64-bit chip and compatible OS. Mobile devices will also be able to pack more than 4GB of memory, which could provide performance improvements.”


64-bit capabilities allow 64 bits of data to be processed at a time, rather than the traditional 32 bits – see a visual depiction of this processing increase in the infographic below:

[Infographic: 64-bit vs. 32-bit data processing]


Learn more in A Map to 64-Bit Computing.
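The memory point is easy to verify on any machine: a 32-bit pointer can address at most 2^32 bytes (4 GB), while a 64-bit pointer can address 2^64 bytes. A minimal Python sketch (my own illustration, not from the article):

```python
import struct

# Pointer size in bytes: 8 on a 64-bit build, 4 on a 32-bit build.
pointer_bytes = struct.calcsize("P")
print(f"{pointer_bytes * 8}-bit Python build")

# A 32-bit pointer can address 2**32 bytes (4 GB), which is why 32-bit
# devices top out at 4 GB of RAM; a 64-bit pointer can address 2**64 bytes.
print(f"32-bit address space: {2**32 // 2**30} GB")
print(f"64-bit address space: {2**64 // 2**30} GB")
```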


Creating a Culture of Innovation by Ed Goldman

"It is easy for many organizations to say that they support innovation. They discuss how they are going about it and how they are providing some resources towards the effort. But when you pull back the surface the support is lacking..." Read more > and follow @edlgoldman.




Meet your Intel IT Experts

Chris Peters
Business Strategist
Chris started his career as a US Navy officer in the nuclear power program before transitioning to manufacturing and supply chain management at Clairol. Technology integration and information systems were critical in these roles and made joining Intel in 2000 a natural move. Since joining Intel, Chris has worked on data center products, has spent time working inside Intel IT and is now focused on business client computing solutions.



Expert Spotlight

Jeffery Ton

Jeffery Ton joined Goodwill Industries of Central Indiana, Inc., in 2010 to provide vision and leadership in the continued development and implementation of the enterprise-wide information technology portfolio, including applications, infrastructure, security and telecommunications across the Goodwill business units. In 2013 Goodwill created the position of SVP of Corporate Connectivity to oversee the formal and informal networks, the external and internal communication, the information and data, as well as the underlying systems. In essence, Ton has responsibility for both the marketing and technology departments.

In previous roles he owned his own management consulting firm and was the CIO for Lauth Property Group. Prior to Lauth, Ton spent 14 years in various technology roles with Thomson Multimedia (RCA). Away from work, he and his wife enjoy canoeing, gardening and travel. He is a member of the US Green Building Council and a LEED Accredited Professional. Ton has led workshops in Green Building, Green Living, and Servant Leadership. He also spends time as a keynote speaker for civic organizations and corporations, speaking on a wide variety of topics, including Lewis and Clark, leadership, green living, and business operations.

Find Ton on Twitter at @jtongici, follow his blog, and check out his previous IT Peer Network posts and discussions.


About IT Peer Network

IT Peer Network is a community of IT professionals looking to discover the business value of IT and share best practices and insights. For more background, read the IT Peer Network community blog.




Missed MWC? Catch up on all the news from Intel.


From the keynotes to the sessions, check out all of the highlights and coverage captured from the show floor in Barcelona on the Intel IT Peer Network.
