
IT Peer Network


IoT innovation in India

Posted by Andrew_Moore Feb 25, 2015

If there’s one thing that sums up the spirit of India, it is this:



A man weaving between cars in slow-moving traffic — selling sun visors, peanuts, and of course, the Harvard Business Review.

I saw this on my recent trip to Mumbai and Delhi, where I was running two CIO Executive Forums. At these events, I noticed the same street-savvy, entrepreneurial flair within business and tech leaders at the board level. Executives from across the political spectrum were uniting behind the government’s agenda for change.


Both events were well attended, and the conversation focused on the major trends driving the need for IT transformation across various industries. Through these conversations, I was struck by the realization that the time gap between technologies arising in mature markets and becoming a reality in emerging markets is ever shrinking.


IoT Innovation


One of the hottest areas for reinventing businesses today is on the topic of the Internet of Things (IoT). There was a lot of discussion around the subject with a clear understanding that change and rapid innovation are critical for success. One financial services CEO said: “For us to succeed and lead the market, it’s imperative that we keep reinventing ourselves.”


Some attendees were still looking for the right business applications for IoT, or researching how to put it into action. However, I was surprised by the number of executives who were already deploying live applications or engaging in trials.


IoT Security


Within the realm of IoT, another top priority for everyone was security. We’ve already seen headlines about fridges and other domestic devices being hacked, and rarely a day passes without a report about the latest large-scale attack, often with a “connected device” as the entry point. As the number of smart devices grows, so does the number of potential attacks on them. Many estimates put the total number of connected devices at around 50 billion by 2020. Based on Gartner estimates, the manufacturing, utilities and transportation sectors will have 736 million connected devices in 2015, all of which must be protected against misuse.


The vast number of connected things presents a challenge, but so does the breadth of technologies involved. The point of attack can now span several technology layers, including edge devices, communications networks, and the cloud, and all of these layers must be protected against intrusion and abuse. Many of the advantages of the Internet of Things disappear if security is managed device by device; a scalable, secure solution is a crucial and fundamental need. One way to achieve this is to use a single security policy management solution to manage device, software and data security in the IoT gateway. The same solution can also manage the IT infrastructure, enabling security across the IT and IoT infrastructure to be monitored and managed through a single interface. This reduces the complexity of protecting a dispersed and technologically diverse network of devices, and makes security management as scalable as the data gathering itself.


IoT Privacy


Privacy was another frequently mentioned priority for IoT. A robust privacy safeguard will be essential to win the trust of the owners of connected devices. Applications that deliver benefits to end users often have the side effect of exposing potentially private information to a service provider. One way around that is to use anonymity features at the processor level that enable service providers to address groups of users without knowing the identity of each one. For example, a group of cars might be authorized to use specific transport information services, with the provider delivering those services without identifying individual cars and their drivers. Intel is working with other silicon manufacturers to implement this concept, so that hardware-based privacy can become a standard component of IoT.
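The group-anonymity idea above can be sketched in a few lines. This is a hypothetical illustration, not Intel's actual hardware scheme (real designs use group signatures and are far stronger): a shared group credential lets the provider verify that a request came from some authorized car without learning which one.

```python
import hashlib, hmac, os

# Hypothetical sketch: every car in an authorized group is provisioned
# with the same group key, so the provider can verify group membership
# without identifying the individual device.
GROUP_KEY = os.urandom(32)  # provisioned into every car in the group

def sign_request(payload: bytes) -> bytes:
    # The car tags its request with a group MAC; nothing here names the car.
    return hmac.new(GROUP_KEY, payload, hashlib.sha256).digest()

def provider_accepts(payload: bytes, tag: bytes) -> bool:
    # The provider checks membership in the group, not identity.
    expected = hmac.new(GROUP_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

req = b"GET /traffic/region/42"
print(provider_accepts(req, sign_request(req)))  # True
```

A shared symmetric key means one compromised car compromises the whole group, which is exactly why the hardware schemes mentioned above use asymmetric group signatures instead.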


IoT Agility


One final and more general observation was around agility. It was clear from the interactions with customers that they recognize the need to innovate quickly. To stay ahead, solution deployments should be scheduled in days and weeks rather than months and years. While this kind of timeframe was previously unheard of, a foundational step towards this agility is to deploy modern infrastructure.


All in all, my experience in India was fantastic. It was a truly insightful meeting of minds on the subject of the Internet of Things. Seeing the innovation coming out of this region was exciting, and I look forward to the ever-developing technology of IoT.


Find out more about how Intel can help you to implement the Internet of Things.


-Andrew Moore


Clouding Around - A mini-blog series on the Cloud with Arif Mohamed

Part 1: 8 Ways to Secure Your Cloud Infrastructure


Cloud security remains a top concern for businesses. Fortunately, today’s data center managers have an arsenal of weapons at their disposal to secure their private cloud infrastructure.

Here are eight things you can use to secure your private cloud.


1. AES-NI Data Encryption

End-to-end encryption can be transformational for the private cloud, securing data at all levels through enterprise-class encryption. The latest Intel processors feature Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI), a set of new instructions that enhance performance by speeding up the execution of encryption algorithms.


The instructions are built into Intel® Xeon server processors as well as client platforms including mobile devices.


When encryption software utilises them, the AES-NI instructions dramatically accelerate encryption and decryption – by up to 10 times compared with software-only AES.


This speedy encryption means that it is possible to incorporate encryption across the data centre without significantly impacting infrastructure performance.


2. Security Protocols

By incorporating a range of security protocols and secure connections, you will build a more secure private cloud.


As well as encrypting data, clouds can also use cryptographic protocols to secure browser access to the customer portal, and to transfer encrypted data.


For example, Transport Layer Security (TLS) and Secure Sockets Layer (SSL) protocols are used to assure safe communications over networks, including the Internet. Both are widely used for applications such as secure web browsing (through HTTPS), as well as email, IM and VoIP.


They are also critical for cloud computing, enabling applications to communicate over the network and throughout the cloud while preventing undetected tampering that modifies content, or eavesdropping on content as it’s transferred.
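As a minimal sketch of putting this into practice, Python's standard `ssl` module can enforce certificate verification and a modern protocol floor for any connection an application makes into the cloud (SSL 3.0 and early TLS are deprecated today, so TLS 1.2 is the practical minimum):

```python
import ssl

# Build a context that verifies the peer's certificate and refuses
# legacy protocol versions.
context = ssl.create_default_context()            # CERT_REQUIRED by default
context.minimum_version = ssl.TLSVersion.TLSv1_2  # no SSLv3 / early TLS

# CERT_REQUIRED is what prevents undetected tampering and
# eavesdropping: the peer must present a valid certificate chain.
print(context.verify_mode == ssl.CERT_REQUIRED)  # True
```

Any socket wrapped with this context (for example via `context.wrap_socket(...)`) then gets the encrypted, tamper-evident channel described above.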


3. OpenSSL, RSAX and Function Stitching

Intel works closely with OpenSSL, a popular open source multiplatform security library. OpenSSL is FIPS 140-2 certified: a computer security standard developed by the National Institute of Standards and Technology and validated through its Cryptographic Module Validation Program.


It can be used to secure web transactions through services such as Gmail, e-commerce platforms and Facebook, to safeguard connections on Intel architecture.


Two functions of OpenSSL that Intel has contributed to are RSAX and function stitching.


The first is a unique implementation of the popular RSA 1024-bit algorithm, and produces significantly better performance than previous OpenSSL implementations. RSAX can cut the time it takes to initiate an SSL session by up to 1.5 times. This provides a better user experience and increases the number of simultaneous sessions your server can handle.


As for function stitching: bulk data buffers use two algorithms for encryption and authentication, but rather than encrypting and authenticating data serially, function stitching interleaves instructions from these two algorithms. By executing them simultaneously, it improves the utilisation of execution resources and boosts performance.


Function stitching can result in up to 4.8 times performance improvement for secure web servers when combined with RSAX and Intel AES-NI.
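To make the two-algorithm pattern concrete, here is a hedged, illustrative sketch of the serial encrypt-then-MAC flow that function stitching optimizes. A toy XOR stream stands in for AES (do not use it for real encryption); the point is only that a bulk buffer takes two independent passes, cipher and authentication, whose instructions stitching interleaves:

```python
import hashlib, hmac, os

def toy_encrypt(key: bytes, buf: bytes) -> bytes:
    # Toy XOR "stream cipher" standing in for AES; XOR is its own inverse.
    stream = hashlib.sha256(key).digest() * (len(buf) // 32 + 1)
    return bytes(b ^ s for b, s in zip(buf, stream))

def encrypt_then_mac(enc_key: bytes, mac_key: bytes, buf: bytes):
    ct = toy_encrypt(enc_key, buf)                        # pass 1: encryption
    tag = hmac.new(mac_key, ct, hashlib.sha256).digest()  # pass 2: authentication
    return ct, tag

ek, mk = os.urandom(32), os.urandom(32)
ct, tag = encrypt_then_mac(ek, mk, b"bulk data buffer")
print(toy_encrypt(ek, ct) == b"bulk data buffer")  # True
```

Executed serially as above, the CPU's execution units sit partly idle during each pass; interleaving the cipher and MAC instruction streams is what recovers that utilization.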


4. Data Loss Prevention (DLP)

Data protection is rooted in the encryption and secure transfer of data. Data loss prevention (DLP) is a complementary approach focused on detecting and preventing the leakage of sensitive information, either by malicious intent or inadvertent mistake.


DLP solutions can profile content against rules and capture violations or index and analyse data to develop new rules. IT can establish policies that govern how data is used in the organisation and by whom. By doing this they can clarify security practices, identify potential fraud and avert accidental or unauthorised malicious transfer of information.


An example of this technology is McAfee Total Protection for Data Loss Prevention. This software can be used to support an organisation’s governance policies.
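The rule-profiling half of DLP can be sketched in a few lines. This is a hypothetical toy, not how McAfee's product works internally: outbound content is matched against policy rules and violations are reported before the data leaves.

```python
import re

# Hypothetical policy rules; real DLP products ship far richer
# detectors (document fingerprints, dictionaries, ML classifiers).
RULES = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "ssn":         re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def profile(text: str) -> list[str]:
    """Return the names of all policy rules the content violates."""
    return [name for name, rx in RULES.items() if rx.search(text)]

print(profile("Invoice for card 4111 1111 1111 1111"))  # ['credit_card']
print(profile("Quarterly roadmap attached"))             # []
```

A gateway or endpoint agent would call `profile()` on every outbound message and block, quarantine, or log anything that returns a non-empty list.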


5. Authentication

Protecting your platform begins with managing the users who access your cloud. This is a large undertaking because of the array of external and internal applications, and the continual churn of employees.

Ideally, authentication is strengthened by rooting it in hardware. With Intel Identity Protection Technology (Intel IPT), Intel has built tamper-resistant, two-factor authentication directly into PCs based on third-generation Intel Core vPro processors, as well as Ultrabook devices.


Intel IPT offers token generation built into the hardware, eliminating the need for a separate physical token. Third-party software applications work in tandem with the hardware, strengthening the authentication process.


Through Intel IPT technology, businesses can secure their access points by using one-time passwords or public key infrastructure.
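For reference, the one-time-password math itself is standardized as TOTP (RFC 6238). The software sketch below shows what is being computed; Intel IPT's contribution is doing this token generation in tamper-resistant hardware rather than in attackable software:

```python
import hashlib, hmac, struct, time

def totp(secret: bytes, step=30, digits=6, now=None) -> str:
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    counter = int((time.time() if now is None else now) // step)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890" at T=59 seconds.
print(totp(b"12345678901234567890", now=59))  # "287082"
```

Both the client token and the server compute the same code from the shared secret and the clock; the hardware-rooted version keeps that secret out of reach of malware on the host.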


6. API-level Controls

Another way in which you can secure your cloud infrastructure is by enforcing API-level controls. The API gateway layer is where security policy enforcement and cloud service orchestration and integration take place. An increased need to expose application services to third parties and mobile applications is driving the need for controlled, compliant application service governance.


With API-level controls, you gain a measure of protection for your departmental and edge system infrastructure, and reduce the risk of content-borne attacks on applications.


Intel Expressway Service Gateway is an example of a scalable software appliance that provides enforcement points and authenticates API requests against existing enterprise identity and access management systems.
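The enforcement-point idea reduces to a simple check: authenticate the caller against the identity system, look up the policy for the requested service, and only forward requests that satisfy both. This is a hypothetical sketch of that flow, not Expressway's actual API:

```python
# Hypothetical gateway policy data; in practice these come from the
# enterprise identity/access management system and a policy store.
POLICIES = {"/billing": {"role": "finance"}, "/inventory": {"role": "ops"}}
API_KEYS = {"key-123": {"role": "finance"}}

def enforce(path: str, api_key: str) -> bool:
    """Return True only if the caller is known AND authorized for the path."""
    identity = API_KEYS.get(api_key)
    policy = POLICIES.get(path)
    if identity is None or policy is None:
        return False  # unknown caller, or a path with no governing policy
    return identity["role"] == policy["role"]

print(enforce("/billing", "key-123"))    # True
print(enforce("/inventory", "key-123"))  # False: wrong role
```

Centralizing this check at a gateway, rather than inside each application, is what makes the governance controlled and auditable.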


7. Trusted Servers and Compute Pools

Because of cloud computing’s reliance on virtualisation, it is essential to establish trust in the cloud. This can be achieved by creating trusted servers and compute pools. Intel Trusted Execution Technology (TXT) builds trust into each server by establishing a root of trust that helps assure system integrity within each system.


The technology checks hypervisor integrity at launch by measuring the code of the hypervisor and comparing it to a known good value. Launch can be blocked if the measurements do not match.


8. Secure Architecture Based on TXT

It’s possible to create a secure cloud architecture based on TXT technology, which is embedded in the hardware of Intel Xeon processor-based servers. Intel TXT works with the layers of the security stack to protect infrastructure, establish trust and verify adherence to security standards.


As mentioned, it works with the hypervisor layer, and also the cloud orchestration layer, the security policy management layer and the Security Information and Event Management (SIEM), and Governance, Risk Management and Compliance (GRC) layer.



Cloud security has come a long way. It’s now possible, through the variety of tools and technologies outlined above, to adequately secure both your data and your users. In so doing, you will establish security and trust in the cloud and gain from the agility, efficiency and cost savings that cloud computing brings.


- Arif

This is the final post in my blog series about transforming the workplace. Be sure to read part 1, part 2, part 3, and part 4.

It’s a new year, and for most, it’s a great time for change. And while this often means new fitness programs, new diets, or other personal goals, what if we made a resolution to change the way we work?

This blog series has been exploring the changing workplace, its inevitable challenges, and how technology is key to transformation. At the end of last year, I talked about making it all work by applying an integrated strategy across culture, IT, and facilities. Here, in this final installment, I want to talk about how Intel implemented real change that resulted in happier employees and, truly, a better way to work. 

SMAC: Intel IT’s Phased Approach

As with any resolution, personal or professional, taking a methodical approach with measurable benefits is key to winning the race. Intel IT took a proactive, phased journey to enabling the SMAC stack—and it’s one that is continually progressing as technologies change.

  • Social – As Intel employees became more mobile, social tools quickly expanded to connect the dispersed global workforce to facilitate people working together in the same “virtual room.” Even better, employees are happier because they can easily connect with coworkers. 



  • Mobile – With an early start in mobile starting about 17 years ago, Intel IT now supports 90K employees at 143 sites in 62 countries, so supporting a seamless collaboration experience for all employees and refining the mobile app experience are top priorities. 


  • Analytics and cloud – With a start in data management and BI, Intel IT is now moving ahead with advanced predictive analytics, machine learning, and data visualization. Cloud efforts continue to evolve as well, including a mail cloud and personal cloud storage that let employees get what they need, when they need it, on any device.


Facilities: The Way We Work

So what about the actual workspace? After all, you can have all the exercise equipment you want, but if you don’t have the right spot to use it, you probably won’t exercise. In other words, poor or less-than-apt conditions can be counterproductive.

When Intel realized its many cubicle spaces were rather underutilized due to employees congregating in meeting rooms and other spaces to simply work together, they sought to strike a balance between collaboration space and personal working space. This manifested in The Way We Work program, based on the premise that any employee will work better in an environment tuned to the way they work. The program’s guiding principles address work style, preferences, company identity, and space.

  • Optimized workspaces foster mobility, collaboration, teamwork, and problem solving. Private phone-booth rooms become virtual offices with network connectivity and HD audio and video.

  • Inviting spaces are modern and make work a place where you want to be; they capture the look and feel of the future, and showcase Intel innovation and technologies.

  • Space efficiency makes optimal use of real estate, including repurposing existing square footage to help offset costs.

Intel takes it a step further, extending these guiding principles to its work groups, or “communities.” Each community is assessed to determine its particular needs for individual work areas, team areas, collaboration rooms, and private phone booths.

The changing workplace marks the end of the “one size fits all” office, but it also reflects a growing union between IT and facilities. For example, the conference room table you sit at today is just a piece of furniture, but in the near term it may come with a touch-screen interface and network connectivity. At this point, is it a piece of furniture or a piece of IT equipment?

There are exciting changes on the horizon. If we resolve to embrace the innovation, we can find a better way to work.


Intel’s Vision on Workplace Transformation

Finally, be sure to read the paper that expands on Intel’s vision of workplace transformation. It captures the topic of this blog series in even greater detail.

Has your organization moved on to a better way to work? Please join the conversation and share your experience. And be sure to click over to the Intel IT Center to find resources on the latest IT topics.

Until the next time …

Jim Henrys, Principal Strategist




We’ve known the innovators at Aerospike for a few years now, and today we are announcing more than 1 million transactions per second (TPS) on a single server with Aerospike’s NoSQL database. That might not seem like such a big deal, until you realize we are not using DRAM for this, as you’ve seen in some previous posts about Aerospike doing 1 million TPS. We are trading out DRAM for NVM (non-volatile memory) in the classic form of NAND memory. NAND is hot to database fanatics like us, because you can store so much more. NoSQL innovators have learned how to utilize NVM with breathtaking performance and new data architectures. NVM is plenty fast when your specification is 1 millisecond per row “get”. In fact, it’s the perfect trade-off: fast, lower cost, and non-volatile. The best thing is the price. Did I tell you about the price yet?


NVM today, and even more so tomorrow, is a small fraction of the price of DRAM. Better still, you are not constrained by, say, 256GB, or some sweet spot of memory pricing that always leaves you a bit short of goal. Terabyte-class servers with NVM give you much more headroom to grow your business without reconstructing and upgrading your world in months. How does 6+ terabytes of NVM database memory on a single box sound?

Here at Intel, we say: be bold, go deep into the terabyte class of database server!


So how did we do this? Well, our friends at Aerospike make it possible with a special file system (often called a database storage engine) that keeps the hash to the data in DRAM (a very small amount of DRAM; we set it to 64 GB), while the actual 1k-or-greater (key, value) row is kept in a large, growth-capable “namespace” on 4 PCIe SSDs. Aerospike likes Intel SSDs for their block-level response consistency, because when you replace DRAM and concurrently run at this level of process threading, consistency becomes paramount. In fact, we like to target 99% of reads completing under 1 millisecond during our tests. Here are the core performance results.
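The storage-engine split described above — index in DRAM, rows on flash — can be illustrated with a toy sketch (hypothetical code, not Aerospike's engine): a Python dict plays the in-memory hash, and an append-only file stands in for the SSD namespace.

```python
import os, tempfile

class TinyKV:
    """Toy key-value store: hash index in RAM, row data on disk."""

    def __init__(self, path):
        self.index = {}             # the "DRAM" part: key -> (offset, length)
        self.f = open(path, "w+b")  # the "SSD namespace" part

    def put(self, key: bytes, value: bytes):
        self.f.seek(0, os.SEEK_END)            # rows are appended
        self.index[key] = (self.f.tell(), len(value))
        self.f.write(value)

    def get(self, key: bytes) -> bytes:
        offset, length = self.index[key]       # one RAM lookup...
        self.f.seek(offset)
        return self.f.read(length)             # ...then one device read

db = TinyKV(tempfile.mktemp())
db.put(b"user:1", b"profile-blob")
print(db.get(b"user:1"))  # b'profile-blob'
```

The design point is that each `get` costs exactly one device read, which is why a sub-millisecond flash read budget is enough to replace DRAM for the row data.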


95% read Database Results (Aerospike’s asmonitor and Linux iostat)

asmonitor data

Record Size | Number of client threads | Total TPS | Percent below 1ms (Reads) | Percent below 1ms (Writes) | Std Dev of Read Latency (ms) | Std Dev of Write Latency (ms) | Database size
1k with replication | 512 | 1,003,471 | 96.11 | 99.98 | 0.87 | 0.30 | 200G



iostat data

Record Size | Read MB/sec | Write MB/sec | Avg queue depth on SSD | Average drive latency | CPU % busy
1k (replication) | | | | |

1. Data is averaged and summarized across 2 hours of warmed-up runs. Many runs were executed for consistency.

2. The 4k test was network constrained, hence the lower CPU utilization attained during this test.


We ran our tests on 1k, 2k and 4k row sizes, and on 1k again with asynchronous replication turned on. We kept the data rows small, which is common for operational databases that manage cookies, user profiles and trade/bidding information in an operational row structure. The Aerospike database does have a binning process that can give you columns, but because so many usages exist for strings, we configured for no-bin (i.e., 1 column). This configuration gives you the highest performance for Aerospike.


The databases we built ranged from 100GB to 400GB, and as we made the database bigger we did not see any drop in performance. We used a small database to maintain some agility in building and re-working this effort over and over. Our scalability problems came about as we scaled the row sizes, and they were at the network level, no longer a balancing act between the SSDs and the threading levels on the CPU. We simply needed more network infrastructure to go to larger row sizes; taking a server beyond 20Gbit of networking at a 4k row size was a wall for us. Supporting nodes that produce 40Gbit and higher throughput rates can become an expensive undertaking. This network throughput and cost factor will affect your expense thresholds and be a deciding factor in how dense an Aerospike cluster you wish to build.


Configuration and Key Results

We used Intel's best 18-core Xeon E5 v3 family servers, which support 72 CPU hardware threads per machine. Aerospike is very highly threaded and can use lots of cores and threads per server; with htop we were recording over 100 active threads per monitoring sample, loading the CPU queues nicely. As for balancing the SSDs and their queue depths, we found that our target range of 95% to 100% of database record retrievals under 1 ms was best achieved at queue depths under 32 on these Intel NVMe (Non-Volatile Memory Express) SSDs. The numbers in the asmonitor data table show that we were actually getting roughly 97% of all transactions running under 1 millisecond. A very high achievement.
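Checking the "95% to 100% under 1 ms" consistency target from a sample of latencies is a one-liner; this hedged sketch (made-up sample values) mirrors how we read the asmonitor percentages:

```python
def pct_under(latencies_ms, threshold_ms=1.0):
    """Percent of observed latencies below the threshold."""
    return 100.0 * sum(1 for l in latencies_ms if l < threshold_ms) / len(latencies_ms)

sample = [0.4, 0.6, 0.8, 0.9, 1.3]  # hypothetical read latencies in ms
print(pct_under(sample))             # 80.0
```

At the observed queue depths, the real runs scored in the 96-97% range on this metric, against the 95% floor we set for the tests.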


Configuration details are below for those attempting to replicate this work. All components and software are available on the market today. Try the Aerospike Community Edition, available as a free download here.







Aerospike version: Community Edition
Bin configuration: Single Bin
Number of nodes
Replication Factor: One (*Two used with 1k rows and replication)
RAM Size: 64 GB
SSDs: Two P3700 PCIe devices per node (4 total)
Write block Size
Example command used to load the database:

./run_benchmarks -h -p 3000 -n test -k 100000000 -l 23 -b 1 -o S:2048 -w I -z 64

Example command used to run the benchmark from client:

./run_benchmarks -h -p 3000 -n test -k 100000000 -l 23 -b 1 -o S:2048 -w RU,95 -z 64 -g 125000

Flags of Aerospike Client:

-u              Full usage

-b              set the number of Aerospike bins (Default is 1)

-h            set the Aerospike host node

-p            set the port on which to connect to Aerospike

-n            set the Aerospike namespace

-s            set the Aerospike set name

-k            set the number of keys the client is dealing with

-S            set the starting value of the working set of keys

-w            set the desired workload (I - Linear 'insert'| RU, - Read-Update with 80% reads & 20% writes)

-T            set read and write transaction timeout in milliseconds

-z            set the number of threads the client will use to generate load

-o            set the type of object(s) to use in Aerospike transactions (I - Integer| S: - String | B: - Java blob)

-D          Run benchmarks in Debug mode




Dell R730xd Server System

One primary (dual system with replication testing)

Dual CPU socket, rack mountable server system

Dell A03 Board, Product Name: 0599V5

CPU Model used

2 each - Intel(R) Xeon(R) CPU E5-2699 v3 @ 2.30GHz, max turbo frequency 3.6GHz

18 cores, 36 logical processors per CPU

36 cores, 72 logical processors total

DDR4 DRAM Memory

128GB installed

BIOS Version

Dell* 1.0.4 , 8/28/2014

Network Adapters

Intel® Ethernet Converged 10G X520 – DA2 (dual port PCIe add-in card)

1 – embedded 1G network adapter for management

2 – 10Gb ports for workload

Storage Adapters


Internal Drives and  Volumes

/ (root) OS system – Intel SSD for Data Center Family S3500 – 480GB Capacity

/dev/nvme0n1 Intel SSD for Data Center Family P3700 – 1.6TB Capacity, x4 PCIe AIC

/dev/nvme1n1 Intel SSD for Data Center Family P3700 -  1.6TB Capacity, x4 PCIe AIC

/dev/nvme2n1 Intel SSD for Data Center Family P3700 -  1.6TB Capacity, x4 PCIe AIC

/dev/nvme3n1 Intel SSD for Data Center Family P3700 -  1.6TB Capacity, x4 PCIe AIC

6.4TB of raw capacity for Aerospike database namespaces

Operating System, kernel

& NVMe driver

Red Hat Enterprise Linux Server Version 6.5

Linux kernel version changed to 3.16.3

nvme block driver version 0.9 (vermagic: 3.16.3)

Note: Intel PCIe drives use the Non-Volatile Memory Express (NVMe) storage standard for non-volatile memory, which requires an NVMe SSD software driver in your Linux kernel. For benchmark work such as this, the currently recommended kernel is 3.19-based.


PCIe NVMe Intel drives latest firmware update and tool

Intel embeds its most stable maintenance-release support software for Intel SSDs into a tool we call the Intel Solid State Drive Data Center Tool. Our latest release just landed, and it is important that you use the MR2 release included in the latest version, 2.2.0, to achieve these kinds of results for small blocks. Intel’s firmware for the Intel SSD Data Center PCIe family gets tested worldwide by hundreds of labs, many of them directly touched by software companies such as Aerospike. No other SSD manufacturer is as connected, both at the platform level and in software vendor collaboration, which underpins the solution-level scalability you see in this blog. Intel’s SSD products are truly platform connected and end-user-software inspired.




The world of deep servers that dish out row-based terabytes has arrived, and feeding a Hadoop cluster (or vice versa) from these kinds of ultra-fast NoSQL clusters is gaining traction. These are TPS numbers never before heard of from a single server in the relational SQL world. NoSQL has gained traction as purpose-built, fast, and excellent for use cases such as trading, session and profile management. Now you see this web-scale-friendly architecture move into the realm of immense data depth per node. If you are thinking 256GB of DRAM per node is your only option for critical memory scale, think again; those days are behind us now.


Come see Holly Watson and Frank Ober at Strata + Hadoop World at the Intel Booth #415. We’d love to talk with you more about our NVMe SSDs and how open industry standards are changing the future of databases and the hardware you run them on.


Special thanks to Swetha Rajendiran of Intel and Young Paik of Aerospike for their commitment and efforts in building and producing these test results with me.

Consumers can be a bit finicky when it comes to wearable technology. While wellness wearables have made a small impact in the consumer market, not much else has. Consumers face device fatigue, investment justification, fashion judgments, and a profound lack of benefits. Endeavour Partners studied early wearable adopters and found that of the U.S. consumers who purchased an activity tracker, more than half no longer use their device. Moreover, one-third of those surveyed stopped using the device within six months of receiving it.


The real value proposition is in the enterprise, and 2015 is poised to be a year of change for wearables in the workplace. These devices allow real-time access to data while freeing the hands for more tactile work, in turn giving the enterprise valuable information. “Wearables can help improve employee efficiency, enhance training and ongoing communication, reduce nonproductive time and rework, shrink decision time frames, minimize exposure to hazardous conditions, decrease travel time and more,” according to Accenture Technology. Companies have the opportunity to streamline training and the decision-making process by having real-time access to employees, which might be especially useful in fieldwork and manufacturing.


Wearables have the potential to disrupt every industry, but currently only 3 percent of companies are investing in enterprise wearables. In its Digital IQ study — slated for release in fall 2015 — PwC reported on the top five industries that have adopted wearables thus far: healthcare (10 percent), technology (7 percent), automotive (6 percent), industrial products (5 percent), and business and professional services (4 percent). In order to stay competitive and relevant, companies need to take notice of wearable technology and how it can positively impact their bottom line. Giants like Salesforce have already set the pace with the Salesforce Wear initiative, and the Apple Watch and Microsoft’s HoloLens are hot on their heels.


Unobtrusive Wearables


For wearables to be successfully adopted into the workplace, companies will need to plan for the following considerations: user experience, workflow modifications, analytics, IT infrastructure, privacy and security, and battery life. “To succeed,” according to PwC, “wearables must first and foremost be human-centered—that is, designed to meet the needs of the user without getting in his or her way.”


Workplace wearables might still be in the “clunk” phase, not unlike their technological predecessors — think flip phones, pagers, Bluetooth headsets, etc. While the challenges are there, so is the technological capability to refine design, utility and functionality. This technology will continue to evolve as our offices become wire-free, deskless, and remote. The possibilities are simply endless.


Check out our Make it Wearable campaign for more information on how we’re transforming the wearable market.


To continue the conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter.

Humans are creatures of habit. Whether it’s the tall, non-fat, extra foam vanilla latte that you can’t start your day without, the route you walk to work, or “your” table in the cafeteria, I’ll bet you follow a whole host of little routines every day. They put you at ease, remove the need to worry so much about what might be around the corner, and generally help you get through the day a little easier.


I’m not knocking it; I like my coffee as much as the next guy. But when this love of familiarity starts to infiltrate the way we work, it can become dead weight. So today, in this fourth blog in my series exploring the impact of the third Industrial Revolution on the financial services industry, we’re going to look at cultural change.


The Revolution from Within


As we’ve seen already, if you want to succeed in this new world of SMAC stack, cloud, and big data analytics, it’s essential that your business remains innovative and agile. You need to be able to change direction quickly when customer demand or the market calls for it. While driving up business velocity, organizations also need to optimize their productivity — there is no point in moving faster if you have to do everything twice. It’s also becoming increasingly important to attract and retain the best and brightest talent — including Millennials and Digital Natives — and to figure out ways to unlock the hidden intelligence buried in your organization.


At Intel, we are working on initiatives with the HR, IT, and facilities teams — both internally and at our customers’ sites — looking at how we can transform workplaces while inspiring employees to get on board with the changes. We’re exploring collaboration, facilities, and personal productivity to help change-wary workers see how they can benefit from more connected and efficient processes. Even simple things can make a big difference, like identifying the person who has the information you need for a project, finding a free conference room, or eliminating wires.


Creating a Better Way to Work


We have an exciting workplace transformation roadmap in place, with new user experiences for enterprise devices, which we shared at our Intel Developer Forum in San Francisco in September 2014. The new features include wireless display technology, wireless docking, wireless charging, and our You Are Your Password concept. This is a multifactor authentication model that uses biometrics, your phone, and your badge to identify you, dispensing with the need for employees to remember passwords.


Indeed, 2015 is set to see a lot of exciting advances and activity in this area. The key to making the most of them lies in making sure you communicate the benefits clearly to your teams, and share best practices with them. Once they see that by making a process more efficient they can spend an extra five minutes speaking to customers (or at their daily tai chi session), they’ll soon jump on board.


Tune in soon for the last blog in this series, where I’ll be discussing the importance of IT security for fostering trust in your brand’s identity.


To continue the conversation, let’s connect on Twitter.


Mike Blalock

Global Sales Director

Financial Services Industry, Intel


This is the fourth installment of a five-part series on Tech & Finance. Click here to read blog 1, blog 2, and blog 3.

Archaeologists dig up the earth looking for items from yesteryear; they find raw, untouched data — ecofacts, artifacts, architecture, tombs — and analyze it, hoping to uncover a snippet of past cultures or some buried treasure.


This is not unlike how organizations approach unstructured data, or dark data. Dark data represents a pooled set of untapped facts, documents, and media that are stored and sit undisturbed until we dig into them, hoping to find those valuable gems in all the clutter that can give us opportunities for prediction and help us better understand the culture, strategy, or bottom line of our enterprise.


Dark Data Is Appearing – and Disappearing – at an Alarming Rate


With all the digitization of data we’ve seen since the late 20th century, we’ve got a data flood on our hands: in 2012 alone, we created 2.5 quintillion bytes of data per day, and that number has continued to grow at unprecedented rates. It’s estimated that in the next decade a whopping 90 percent of all the data created will be unstructured. Much of it will remain dark data, which Gartner defines as “the information assets organizations collect, process and store during regular business activities, but generally fail to use for other purposes.”


A subset of this unstructured data will soon come from the growing popularity of the Internet of Things (IoT). According to Randy Bean at MIT Sloan Management Review, data generated by “things” is projected to grow from 2 percent of all data in 2013 to 10 percent in 2020. Real-time data generated by the IoT will add a unique spin to the management of unstructured data, as the enterprise will need to find ways to use, process, and analyze this device-generated data as it occurs.



Finding and Using Dark Data Before It’s Archived


Dark data is not organized in a predefined, relational database model, like its structured counterpart. It’s variable and rich, containing word processing documents, social media posts, images, presentations, and emails. The majority might be digital noise, but by linking unstructured and structured data, there is real opportunity to make sense of this vast amount of information and unearth new intelligence.


Before we go on a digging expedition, however, it’s important to establish a system that can help your business analyze and create context around your dark data. An archaeologist only begins excavation after formalizing an objective and surveying the land. Find out how much data you have and where it is. Find out what types of data you have. Find out which data should be destroyed, kept for further analysis, or migrated to less expensive storage.


At Intel, we’ve employed a multi-platform strategy for analyzing different data types, including an enterprise data warehouse (EDW), an Apache Hadoop platform, and a low-cost massively parallel processing (MPP) platform. The Apache Hadoop platform is designed to process big batches of unstructured data. Hadoop clusters work well with unstructured data because they act as a low-cost storage repository where potentially valuable data can be kept until a strategy is in place for its use.
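To make the batch-processing idea concrete, here is a toy map/reduce word count in plain Python, the same pattern Hadoop applies at scale across a cluster. The two in-memory “documents” stand in for a real unstructured corpus; this is a teaching sketch, not actual Hadoop job code.

```python
# A toy illustration of the map/reduce pattern Hadoop applies to large
# batches of unstructured text, run here on two in-memory "documents".
# Real Hadoop jobs distribute these two phases across a cluster.
from collections import Counter
from itertools import chain

documents = [
    "dark data sits undisturbed",
    "dark data holds buried treasure",
]

# Map phase: each document emits (word, 1) pairs.
mapped = chain.from_iterable(
    ((word, 1) for word in doc.split()) for doc in documents
)

# Reduce phase: sum the counts per word.
counts = Counter()
for word, n in mapped:
    counts[word] += n

print(counts["dark"])  # 2
```

The value of the pattern is that the map phase is embarrassingly parallel, which is exactly what makes cheap cluster storage like Hadoop’s a good home for data you haven’t yet decided how to use.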


While mining unstructured data can be a costly venture, it can deliver incredible value by pointing to trends that can cut cost, boost productivity, improve your ROI, and ultimately give you deeper insight into your organization.


For more resources on big data and predictive analytics, click here.


To continue this conversation, or to react to the topic, connect with me at @chris_p_intel or use #ITCenter.

"Capital isn't that important in business. Experience isn't that important. You can get both of these things. What is important is ideas." - Harvey Firestone


The world is in need of a better way to work.


We come to the office with the expectation of getting things done. The workplace is supposed to be a technology-fueled hub for collaboration, connection, problem solving, and simplified information sharing. It’s where we collectively drive innovation in hopes of building a greater enterprise.


But when technology impedes that productive environment, employees lose valuable time. Frustration mounts, workflow is disturbed, ideas lie dormant, collaboration grinds to a halt. And inevitably, the business loses money.


It’s time to change that. It’s time for the business and technology to align to build a more constructive and inspiring environment. It’s time to talk about how Intel 5th Gen Core™ vPro™ processor-based systems will change the way we work.


The Power of 5th Gen Technology


When we imagined 5th Gen, this is what we saw — a workplace that is operating at the speed of your employees. A wire-free space that emphasizes mobility, flexibility, and maximum productivity without compromising user satisfaction. Technology that’s as fast as it is secure and manageable, that allows for greater information sharing and the delivery of actionable insights to the enterprise.


With Intel Wireless Display (Intel WiDi), Intel Wireless Docking, and the optimized performance of the 5th Gen Core vPro processor, we’re striving to bring you the workplace of the future. No cords and longer battery life mean more freedom and mobility. Wireless sharing capabilities lead to smoother teamwork. And our processor is equipped with Intel Identity Protection Technology that keeps users safe and secure.


The 5th Gen Core vPro family was built with a simple concept in mind: Technology should always act as an economic driver for your business. It should always be an enabler, never an inhibitor. Our desire was to minimize delays in daily processes so that employees can focus solely on the task at hand and maximize the probability of good ideas coming to light.


The Power of the Idea


If your employees are hindered by their devices, they’re not providing their fullest value to your organization. Think of the increments of time spent on menial tech-related tasks that detract from overall productivity. The tiny actions that add up to tremendous loss and detract from your business’s bottom line. The ideas that lay sleeping for so long they were eventually forgotten.


Now think of what the 5th Gen Core vPro family could bring to your business. It’s time we change the way we work; join me as we enter the next phase of business computing and collaboration, and a new world of ideation for the enterprise.


To continue this conversation on Twitter, please use #ITCenter or #WorkingBetter.

Could a marketing guy really represent the future of digital leadership?


I remember the day that I met Charlie Cole. We were both at an executive networking event and just happened to cross paths.


We had absolutely nothing in common.


He was running digital marketing for Lucky Brand Jeans. I was running an IT consulting firm.


But for some reason, we hit it off. Something told us we shared common values. I don’t know, maybe I just liked him. Either way, we’ve stayed in touch over the years, and I’ve followed his career with great interest and fascination as he rapidly transformed himself.


That’s why it was so exciting for me to be able to travel to New York a few weeks ago to interview him for the Transform IT Show.


The Fusion of Business and Technology


I was there to tell the story of his rise from “marketing guy” (as he was when I met him) to high-tech CEO of a new startup called The Line. And I think that his story is really the story of our future, because he represents what I believe the modern digital leader will look like.


I believe that modern digital leaders are going to be part IT, part business, and it won’t matter which side of the house you started on. As the lines between business and technology blur, we’re going to need to embrace both sides of the equation if we want to get the job done.




As we talked, Charlie shared some great insights with us. The days of IT as an order-taker are over. The divide between the CIO and the CMO is a myth. We need instead to come together to solve problems. These were all truths exemplified by Charlie’s own journey.


Understanding Your Role as a Digital Leader


If you’re an IT professional, I think Charlie demonstrated to us what the modern business leader is going to look like: someone who understands technology intrinsically. You’re going to have to speak his language if you hope to survive, because he is embracing technology in all of its forms in his quest to change the game for his company.


And if you’re a business professional, this is what your future will look like. You can’t abdicate your responsibility to understand the intricacies of how technology will impact every facet of your business. The only way for us to thrive will be to do it together. And we can’t do that unless each of us becomes intimately familiar with the whole package.


But perhaps it was his last bit of advice that really told us how to do that.


He challenged us to embrace our own awesomeness and to accept our weaknesses. To spend more of our energy focused on improving our strengths and less on improving what we aren’t good at. It is advice that has the power to change everything about how you approach your career and your life.


Those two challenges may seem at odds with one another. Embrace the other side of the equation, but also accept your weaknesses and your strengths. But if you think about it, it’s really a recipe for the balance and humility that we will each need to come to the table open, informed, and comfortable with our own role. And I think that it will be one of the keys to finding your own pathway to success as you strive to become a digital leader.


Watch the full episode of The Transform IT Show with Charlie Cole and watch the latest Hangout where we discuss our biggest takeaways from the show. The first 15 people to RSVP will be eligible for a free copy of my book, “The Quantum Age of IT.”

“Shared information multiplies its value, hoarding information diminishes it. Increased transparency not only helps to share the information, but builds trust.”

David Coleman, CMS Wire


A Shift in Mindset


Enterprise collaboration is more than just bringing people together; it’s about enabling people to work better together and to deliver business results faster. David Coleman discusses what he calls the collaboration shift: “The collaborative shift is a shift in mindset. It incorporates attitudes, morale, culture, relationships and more, but fundamentally it's a paradigm shift in the way you think about work. It includes considering the ‘we,’ as well as the ‘me.’”


But why change? A colleague of mine always references David Weinberger, who said, “The smartest person in the room is the room.” You may be an expert in your domain, but learning new things and solving problems alone takes time. There is a definite business advantage in being able to find the experts with the knowledge you need hiding somewhere in your organization. It’s faster to leverage their expertise when you need it.


Applying the Collective Wisdom


CIOs are starting to think strategically about collaboration. Collaboration itself needs to be a strategic initiative, one that can be integrated into all of the services that IT provides. Why wouldn’t you want employees to be able to work better and with greater velocity by having access to the collective wisdom of “the room”?


It sounds easy, but it’s hard to apply what you don’t know firsthand. IT itself has to embrace collaboration, working more collaboratively and applying social collaboration concepts to its own problems.


As an example, I helped a small team learn about crowdsourcing and apply it in IT to solve a problem. Together, we designed an IT cost-cutting idea jam using our internal collaboration platform to source and collaborate on new ideas from IT employees. We explained that employees delivering services are in a great position to understand the details of how IT really works and that we needed their valuable insight. Over a three-week period, a community formed. IT employees submitted ideas, reviewed their peers’ ideas, and commented or asked idea owners for clarification, which further developed the ideas. Employees were able to vote the ideas up or down.


By the end, IT employees were engaged: they provided 98 new ideas on how to cut costs from the ground up. The small team I worked with then facilitated virtual discussions with the top idea owners and did a lot of matchmaking so people with similar ideas could collaborate. Many of the crowdsourced ideas are being implemented today.
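The voting mechanics of an idea jam like this can be sketched in a few lines of Python. The idea titles and vote tallies below are invented for illustration; a real platform would track voters and timestamps as well.

```python
# Minimal sketch of ranking crowdsourced ideas by net votes.
# Titles and tallies are made-up example data.
ideas = [
    {"title": "Consolidate test servers", "up": 42, "down": 5},
    {"title": "Automate patch rollouts", "up": 30, "down": 2},
    {"title": "Print double-sided", "up": 12, "down": 9},
]

def net_votes(idea):
    # Net score: upvotes minus downvotes.
    return idea["up"] - idea["down"]

ranked = sorted(ideas, key=net_votes, reverse=True)
for idea in ranked:
    print(f"{net_votes(idea):+4d}  {idea['title']}")
```

Even this simple net-vote ordering is enough to surface the handful of ideas worth a facilitated follow-up discussion.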


This IT idea jam community learned how to apply crowdsourcing and use social collaboration to move their own organization forward in a cost-effective, productive way. They are now advocates of collaboration. And there are efforts like this happening every day across the company. People are starting to see the value in collaboration.


Best Practices


At Intel, we’re focused on enhancing our existing collaboration experience to increase velocity and leverage the knowledge of the whole organization. We are connecting employees around the world to each other and to content they would never have had visibility into before. It’s about breaking down organizational, geographical, and hierarchical barriers so that employees can solve problems together.


The modern workforce, especially Millennials, has learned to expect a high level of feedback and social interaction when online. Leveraging social actions like shares, mentions, or voting, or introducing gamification, can help boost productivity, performance, and engagement.


It’s been a journey. We have made substantial improvements as we evolve in these four areas: integration, security, engagement, and mobility. Our goal is to make sure our employees and business partners can collaborate easily and effectively so that business opportunities are not missed.


And we are in the midst of that shift in mindset, a collaboration shift.

To continue the conversation, please follow me on Twitter or use #ITCenter.

Intel provides tools tailored to each product line and target segment: Consumer, Professional (IT client), and Data Center. In my area, Data Center, we provide SATA- and PCIe-based SSDs, and for all customers of those Data Center products we provide a single command-line tool, the Intel SSD Data Center Tool. Intel’s global Download Center provides access to most Intel product-related software, drivers, and tools. We bundle drive firmware fixes into these Data Center Tool releases, along with new features of the tool itself, such as new administrative functions for the NVMe storage protocol in upcoming releases.


Intel provides essentially the same command-line features in each version of the tool for the two popular OS platforms, Windows and Linux. However, on Windows you need the Intel NVMe driver, which has administrative capability, for complete functionality. The Windows driver from Intel also gets rigorous testing inside Intel labs and with software partners, so of course we recommend it; I’ve included a link to that download below. On Linux, NVMe drivers should come from your OS distribution vendor (e.g., SLES or Red Hat). See my other blogs for more detailed “go it alone” help with Linux and the NVMe driver stack.


If you are seeing issues, please download the tool and update to the newest PCIe firmware (for the P3700 and P3600 products). Update things like BIOS, drivers, and firmware based on need, not on whim. I have an upcoming blog on work done with this latest PCIe firmware release on a NoSQL workload, with quite impressive results at 1K and 2K block sizes. Stay tuned. Intel NVMe SSDs can now show very impressive numbers on very small block I/O, and small blocks are the staple of very fast NoSQL databases, so this is worth a mention if you are using very small database records.


Version 2.2.0 of Intel SSD Data Center Tool.

Intel® Download Center

NVMe Administrative driver for Windows 2008 and Windows 2012 Server Editions

Intel® Download Center


Have fun with it. I’m very impressed with our latest firmware, and since I’m an end-user tester rather than part of the firmware authoring team, I have to be convinced by the numbers. I have been convinced, and so have some of our biggest software partners. Try it out!
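The “based on need, not on whim” advice comes down to comparing the firmware you are running against the version bundled with a tool release. Here is a minimal, hypothetical sketch of that comparison in Python; the dotted version strings are made up for illustration, and the actual query and update are of course performed with the Data Center Tool’s own commands, not this helper.

```python
# Hypothetical sketch: decide whether an update is warranted by
# comparing dotted version strings numerically. Only the comparison
# logic is runnable here; querying and flashing a drive is done with
# the Intel SSD Data Center Tool itself.

def parse_version(v):
    # "2.2.0" -> (2, 2, 0), so versions compare numerically, not lexically.
    return tuple(int(part) for part in v.split("."))

def needs_update(installed, bundled):
    # Update only when the bundled release is strictly newer.
    return parse_version(bundled) > parse_version(installed)

print(needs_update("2.1.5", "2.2.0"))  # True: bundled release is newer
print(needs_update("2.2.0", "2.2.0"))  # False: already current
```

The tuple comparison matters: a naive string compare would wrongly rank “2.10.0” below “2.9.0”.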


The Evolving Workplace

Posted by ChrisPeters Feb 6, 2015

Ever since visiting my father’s office as a small child, I have understood the importance of personalizing my workplace. Seeing pictures of the family on his desk, diplomas on the wall, and the surrounding library of books communicated who he was and how capable he was at his job.


Mixed with those personal effects were productivity tools: whiteboards, inbox/outbox trays, paper, printers, Rolodexes, various office supplies, and the ever-abundant Post-it note. Today, much of this clutter has been automated away by modern applications, technology, and devices, and this is especially true when I consider social media tools. A cartoon I saw recently summed up the integration of these desk-based productivity tools nicely.

We’ve significantly reduced our desktop clutter through digital devices and applications. As a result, many of the traditional workplace effects are now relics of the past, replaced with new tools that house photos, collect our notes, organize our contacts, send our communications, and so on.


As new digital tools are introduced and virtualized workspaces continue to evolve, the traditional desk setup will evolve again. If we focus on what is cluttering our desks today, why do we still deal with the mess of tangled wires and plugging and unplugging from the devices needed to get our work done?


Last week, Intel introduced its 5th Generation Core processors, which offer a wire-free work experience and mobile collaboration tools that improve productivity. Thinner, lighter devices will leave us free and flexible while working on the go, all day long, with battery life topping eight hours on a single charge. A secure wireless docking experience and wireless display, paired with new hands-free voice command technology, take the hassle out of mobile work.


I found this Intel IT Center white paper on modern collaborative technologies helpful in showcasing the future of work technology and experiences. As a remote worker, I find conventional office accessories ineffective and unwieldy for my daily tasks. The modern collaboration technology I look forward to most is the shared virtual interactive whiteboard.


What technology do you want that can provide you a better way to work?




To continue this conversation on Twitter, connect with me at @chris_p_intel or use #workingbetter.

Designing a new server room may initially seem a daunting task; there are, after all, many factors and standards to consider. However, setting up the space and equipment doesn’t have to be an ordeal as long as you plan in advance and make sure you have all the necessary items. Here’s a checklist to facilitate the design of your data center.

Spatial Specifications

  • The room should have no windows.
  • Ensure the space is large enough for future growth.
  • The ceiling should be at least nine feet high.
  • Use a drop-ceiling return to exhaust heat.


Equipment Specifications


  • Computer racks should have a clearance of at least 42 inches.
  • All racks should have proper grounding and seismic bracing.
  • Computing equipment should have a maximum electrical intensity of 300 watts per square foot.
  • The server room should contain fire, smoke, water, and humidity monitors.
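As a quick sanity check against the 300 watts per square foot guideline above, you can divide total rack wattage by the floor area it occupies. The rack power and footprint in this sketch are assumed example numbers, not a recommendation.

```python
# Quick check of the 300 W/sq ft electrical-intensity guideline from
# the checklist. The 5 kW rack and 20 sq ft footprint are assumed
# example figures.

def power_density(total_watts, area_sqft):
    # Watts per square foot over the space the equipment occupies.
    return total_watts / area_sqft

# e.g., a 5 kW rack over a 10 sq ft footprint plus 10 sq ft of clearance:
density = power_density(5000, 20)
print(density)         # 250.0 W per square foot
print(density <= 300)  # within the checklist limit
```

Running the same check at design time for each planned rack row tells you early whether you need to spread load across more floor area.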


Cooling Specifications


  • Racks should be arranged in a hot-aisle/cold-aisle configuration.
  • Use cooling equipment with variable-speed fans.
  • Plan for redundancy; do not rely on building cooling for backup.
  • Underfloor cooling systems require a raised floor with a minimum height of 24 inches, able to hold the weight of server racks and equipment.

Electrical Systems Specifications

  • Computer equipment and HVAC should have separate power panels.
  • There should be no heat-generating support equipment.
  • Electrical systems should have an isolated ground, grounding grid and dedicated neutral.
  • Separate backup power should be available for the data center.
  • The electrical system should have a shunt trip for purposes of emergency shutdown.


Data Center Resources has had a reputation for providing superior data center solutions since 2002. Our dedicated team understands that while the solution is important, it is only a part of the overall relationship with our clients. Responsiveness, after sale service, ease of purchasing and breadth of product offerings are other important factors, and we are committed to exceeding expectations in all of these areas. Our principals and project specialists each have several years of experience in providing technical environment solutions.  Contact our team today to find out how we can help you design a new server room.

2014 was another challenging year for the CIO, with plenty of column inches given over to debating the control and usage of technology across the enterprise, and much speculation about the validity of the role itself.

Personally, I think talk of the demise of the CIO role is presumptuous. What is critical right now is that the role evolves, with 2015 the time for CIOs to flourish and show their true worth in helping set the strategic direction of their organisations. The CIO role is like no other in that it affords visibility across the organisation that others rarely achieve. Those who are commercially astute, with the capacity to add tangible value to the business, will excel; those who are not will likely be sitting in a different chair at the start of 2016.

As a result of the recent economic turmoil and the rapidity of change across the commercial landscape, many organisations are now looking for a different type of CIO or technology leader than they have in the past. They are shifting away from the traditionally technically focused individual towards one who can unravel the complexity of IT, increase accessibility to technology, and stay open to new ideas, with the ability to work with peers on getting the right things done. One of the key factors in this evolution is the need to understand and appreciate that CIOs no longer have the ultimate say over which technologies are used within their organisation, yet they will still be held accountable for making sure it all works.

Gartner research has shown that 38% of IT spend already sits outside of IT, and Gartner expects this to reach 50% by 2017. This will send a shiver down the spine of many a CIO, but they must understand the diversification of technology usage and need across their organisation. It is quite a culture shift for many who have migrated into the CIO role from the traditional ‘lights on’ IT director role of old, but it will make absolute sense for those able to evolve into this new model, which will free them up to get more involved in defining and executing the ‘big picture’ strategy. For too long the CIO has been seen as the strategic and commercial weak link in the C-suite, not adding tangible value across the business. CIOs must seize this opportunity to transform their role and reputation into one that thinks collectively, understands how best to resolve the issues that matter across the business, and ultimately delivers commercial value.


The main theme and focus for many of us this year is how to transform and drive a digital business. Naturally this is a hot topic for CIOs, and the challenge of how to transform your business to a digital operating model now gets top billing on the agendas of many boardrooms across the globe. This is exactly where the CIO can step up and work with peers and key stakeholders across the business to define a strategy moulded around a ‘customer first’ approach, where digital technologies form the cornerstones of how your services are delivered and consumed going forward. This will require a great deal of managing of change, process, and incumbent technology, and possibly a marked change in strategic direction: a role tailor-made for the commercially astute CIO working in harness with the CMO.

The impact of digital business on industries and individual organisations should not be underestimated. Gartner has predicted that by 2017 one in five industry leaders will have ceded their market dominance to a company founded after 2000. This is a bold claim, but one I support: you can no longer rely on historical dominance of your sector. Either embrace disruption now or start planning your burial in the corporate graveyard alongside luminaries such as Kodak and Blockbuster.


CIOs must embrace a ‘bimodal IT’ mindset, simultaneously embarking on the digital transformation journey whilst maintaining business-as-usual (BAU) services. It’s no secret that the most successful CIOs are those who can run the business and transform it at the same time. Many industry observers and consultants will tell you they have witnessed more transformation in the last three years than in the previous 20 combined, which shows how important these skills are in the modern CIO. I don’t see any lessening in this pace, as the demand for new and simpler ways to consume data, information, products, and solutions is only going to increase year on year as the technology, and accessibility to it, improves.


CIOs will also need to start thinking about what talent to bring into their organisations this year to manage this bimodal approach, as the market for the best talent is already stretched and growing ever more taut. CIOs should help their business colleagues and the CEO think outside the box to imagine new scenarios for digital business that cross companies and industries, providing a great opportunity to amplify the CIO’s role in the organisation.


Gone are the days when you could supply rigid corporate systems accessible only on site. The corporate world has evolved; everyone wants to consume technology in different ways, and previously inaccessible data is now lusted after for the new operational and commercial insights it can yield.


CIOs need to help create the right mindset and a shared understanding among key decision makers in the enterprise, helping them “get” the possibilities of digital business. They must take a leadership role in helping their organisations change their mindset to what’s possible, and what’s inevitable, in a digital business future.


This should not be done in isolation, or be detrimental to key relationships such as that with the CMO; it’s imperative you work together to deliver the ‘right’ digital strategy for your organisation.


Get yourself in the digital driving seat and don't become a passenger.  It’s going to be a busy year with a fair amount of turbulence, so buckle up and enjoy the ride.



As many financial services organizations are discovering, there’s a new currency in town, and it’s unlike anything we’ve ever dealt with before. The more of it you have, the more each piece is worth. Many banks and other financial institutions are sitting on huge stocks of it, yet have still failed to realize any returns.

This new currency is data. Today I’m continuing my exploration of the Third Industrial Revolution by taking a look at analytics. Because it’s not just about how much data you have, but whether you can extract real value from it.

Financial services is a data-driven enterprise. Banks manipulate and process data like a manufacturing company processes raw materials. It’s no surprise that almost every financial services organization I have spoken to in the last year has identified big data and analytics as their top priorities. While it's clear that understanding this data is critical, many still struggle with what to do and how to do it.

Learning to Manage Volumes of Data

Intel recently sponsored a report on big data cases in banking and securities, created with the STAC Benchmark Council, which looked at the big data/analytics use cases common in both investment and retail banking today. Among other things, the report revealed a mix of approaches, with some organizations using big data to do old things faster or better, and others using it to do completely new things. Of the famous three Vs, volume was found to be the most challenging issue among participating financial organizations.

To avoid being overwhelmed, a good first step is to narrow the focus to the top two or three use cases that will provide the most value or impact on the business. In my view, these are the three pillars of big data/analytics workloads in financial services which represent the greatest opportunities for investment:

1. Risk management and portfolio optimization: A consolidated view of data across the enterprise, as demanded by regulators. This touches areas like enterprise credit risk reporting, securities fraud early warning, credit card fraud detection, and anti-money laundering.

2. Customer engagement optimization: Achieving a 360-degree view of the customer (both consumer and business), with personalized and contextual information to enable targeted cross-selling and up-selling.

3. Increasing operational efficiency: Using big data to improve internal processes and drive incremental innovation in areas such as modeling branch behavior or IT operations analysis.

When bringing big data analytics to one of these areas, there’s a lot to consider. How much will it, and should it, cost? How can companies hire the right data scientists? And how can financial services companies cope with the volume, velocity, and variety of data, and develop usage models that will help drive insight from it?


Empowering Customers to Leverage Analytics

Our goal when approaching these areas with our financial services clients is to help create an open, interoperable analytics infrastructure and data platform that empowers them to develop the solutions, approaches, and processes that will work for them and their customers. In addition to core platform technology like CPUs, solid-state drives, networking, fabric, and security, we also encourage them to think about easier implementation and management (e.g., using analytic data management software such as Cloudera’s, which is based on the open-source Apache Hadoop framework). Using standards-based architecture helps with the recruitment challenge, and also helps reduce up-front and ongoing technology costs.

As a data-rich financial organization, you need to think of big data, analytics, and enabling technologies as your new toolkit. They’re just as important as your online banking platform, your CRM software or your sales database. In fact, it's the piece that will bring all these disparate elements together and help you extract maximum value from your data currency.


Let's continue the conversation on Twitter: @blalockm

Mike Blalock

Global Sales Director

Financial Services Industry, Intel

This is the third installment of a five-part series on Tech & Finance. Click here to read blog 1, blog 2, and blog 4.
