
IT Peer Network



If you've ever wondered how many virtual machines (VMs) you can deploy in a single rack, or how you can scale VMs across an entire enterprise, then you may be interested in what Intel and VMware are doing. While not all enterprises operate at the same scale, there’s no doubt that hyper-converged storage technology is changing. What you may not realize, though, is that the way servers and storage are used is also changing, with real implications for the scaling needs of medium and large enterprises.

Virtualization and the Storage Bottleneck

Many enterprises have turned to virtualized applications as a way to cost-effectively deploy services to end users; delivering email, managing databases, and performing analytics on big data sets are just some examples. Using virtualization software, such as that from VMware, enterprises can lower IT cost of ownership by enabling increased virtual machine scalability and optimizing platform utilization. But as with any technology or operational change, there are often implementation and scaling challenges. In the case of virtualized environments, storage bottlenecks can cause performance problems, resulting in poor scaling and inefficiencies.


All Flash Team Effort

The bottleneck challenge involves scaling the adoption of virtual machines and their infrastructure while still providing good user performance. Such problems are faced not just by larger enterprise IT shops, but by small and medium businesses as well. Intel and VMware teamed up to deliver a robust, scalable All Flash Virtual SAN architecture.


Using a combination of the latest Intel® Xeon® processors, Intel® Solid State Drives (SSDs), and VMware Virtual SAN, customers from SMBs to large enterprises can now roll out an All Flash Virtual SAN solution that provides not only a scalable infrastructure, but also blazing performance and cost efficiency.


Technical Details - Learn More

The world’s first 32-node All Flash Virtual SAN using the latest NVMe technology will be displayed and discussed in depth during EMC World in Las Vegas, May 4-7. The all-flash Virtual SAN is built with 64 Intel® Xeon® E5-2699 v3 processors, each with an Intel® SSD DC P3700 Series NVMe cache flash drive fronting 128 of Intel’s follow-on 1.6TB data center SSDs. Offering over 50 terabytes of cache and 200 terabytes of data storage, it produces an impressive 1.5 million IOPS. This design will surely impress the curious IT professional.
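As a quick sanity check on those figures, here is a back-of-the-envelope sketch; the per-node IOPS split is simple division over the 32 nodes, not a measured result:

```python
# Back-of-the-envelope check of the 32-node All Flash Virtual SAN figures.
NODES = 32
DATA_DRIVES = 128      # follow-on data center SSDs in the cluster
DATA_DRIVE_TB = 1.6    # 1.6 TB per data drive

data_tb = DATA_DRIVES * DATA_DRIVE_TB
print(f"Data tier: {data_tb:.1f} TB")   # over 200 TB, matching the quoted total

TOTAL_IOPS = 1_500_000
print(f"Average IOPS per node: {TOTAL_IOPS // NODES:,}")
```

The 128 drives at 1.6 TB each work out to roughly 204.8 TB, consistent with the "over 200 terabytes" quoted above.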


Chuck Brown, Ken LeTourneau, and John Hubbard of Intel will join VMware experts to showcase this impressive 32-node All Flash Virtual SAN on Tuesday, May 5 and Wednesday, May 6 from 11:30 am to 5:30 pm each day in the Solutions Expo, VMware Booth #331. Be sure to stop by, speak to the experts, and learn how to design enterprise-scale, all-flash Virtual SAN storage.

Wow, I can remember my first day at Intel in 1987 (when I was a very young engineer), when I was assigned a cubicle on the same floor as Gordon Moore. I learned about Moore’s Law, which was celebrating 20-plus years at the time. How amazing that it is still true!


Who would have thought that it would survive 50 years! Intel and Atos have teamed up to deliver Workplace Transformation solutions based on Moore’s Law. Over three billion people worldwide are using computing devices today, and this figure is consistently rising.


Computers are playing an increasingly significant role in our lives, and every year they’re getting faster and more powerful, addressing consumer needs at all levels. But what is driving these developments and how can we ensure we keep up with end-user demands?


The answer is Moore’s Law, and my good friend John Minnick has posted an insightful blog on the topic. Happy 50th, Moore’s Law!



Blog excerpted from Atos Ascent with permission from the author, John Minnick: http://ascent.atos.net/tick-tock-fuelling-customer-experience-processing-power/


Tick Tock – Fuelling the customer experience through processing power

by John Minnick


Over three billion people worldwide are using computing devices today, and this figure is consistently rising. Computers are playing an increasingly significant role in our lives, and every year they’re getting faster and more powerful, addressing consumer needs at all levels. But what is driving these developments and how can we ensure we keep up with end-user demands?

Increasing processing power

Moore’s Law states that the density of transistors in an integrated circuit, or (micro)chip, doubles roughly every two years (figure 1). In essence, a transistor uses a small electric current to switch or amplify a larger one, so you can imagine what happens when you put billions of them together. This is why we have the incredible compute capability and processing speed we see in microprocessors today.
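The doubling compounds quickly; a quick illustration of the growth curve (the starting transistor count here is an arbitrary assumption, purely for scale):

```python
# Transistor count under Moore's Law: doubling roughly every two years.
def transistors(start_count: int, years: float, doubling_period: float = 2.0) -> float:
    return start_count * 2 ** (years / doubling_period)

# Starting from a hypothetical 1,000,000-transistor chip:
print(transistors(1_000_000, 10))   # 32x after one decade
print(transistors(1_000_000, 20))   # 1024x after two decades
```

Five doublings per decade means a chip grows about 32-fold in ten years, which is how billions of transistors became routine.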


Using the ‘Tick Tock’ mantra to improve the end-user experience

Intel provides microprocessors, and we’re working with them to ensure the development process consistently improves the end-user experience. My role in this involves running a series of tests, ranging from pure performance-based testing of the central processing unit (CPU), to looking at input/output, 3D graphics, memory, image filters and more, leading to an overall score for each system.  Using a testing methodology combining many factors is important: what good is processing power alone if people don’t need it? Because of the transistor density available today, we are able to have both – hugely powerful devices that live up to the expectations of increasingly digital consumers, such as servers, desktop workstations, laptops, tablets, smartphones and Ultrabooks.


Depending on personal requirements, some individuals will require more capabilities and processing power than others. In computing, there are three distinct types of requirement: the executive or highly mobile worker, mainstream, and workstation. The executive CXO needs access, mobility and security from their device – mainly to communicate, i.e. send e-mails, monitor dashboard metrics and access back-end data centers from anywhere in the world. At the other end, the engineering or analyst workstation user is typically more stationary, using much more of the compute power for applications and workloads. With recent developments in compute capabilities such as Ultrabooks, the lines are becoming blurred. With the ever-increasing demand to use computers in mobile environments, we will continue to see new platforms and new applications tested as they enter the market.


Fig. 2 – The three types of user devices: scores on performance, power and user experience

To ensure the developments being made are in line with customer feedback, Intel typically releases one to two new processors per year based on its ‘Tick Tock’ mantra. Here, the ‘tock’ introduces a new microarchitecture with new features and capabilities, and the ‘tick’ moves that design to a new process technology, improving its performance and efficiency. This Tick Tock cadence is used because it allows a product to be brought to market quickly, without having to wait for all the new features to be 100% tuned to optimal performance levels. By releasing regular updates, through rigorous testing and listening to user feedback, we ensure that the updates are focused on addressing actual customer demand, at the time it is required.

Forging alliances to bolster expertise

Working closely with Intel gives my team insight into how processors have evolved over the years, and offers us an opportunity to ensure the products are being updated in line with customer requirements. We make recommendations about when it’s appropriate to move to the next processor, how it affects compute functions or even the business.

In the global economy, companies benefit from forging alliances to bolster their expertise. When this process allows you to get closer to the client experience, that’s when you know you’ve hit a winner.

Originally posted on ascent.atos.net on April 16, 2015 by author John Minnick: http://ascent.atos.net/tick-tock-fuelling-customer-experience-processing-power/


To continue the conversation, let's connect on Twitter:

Rhett Livengood, Director of Enterprise Sales Solution Development - Intel


Watch Matthew Rosenquist’s cybersecurity industry outlook presentation, delivered to business executives and distinguished government guests at the Cyberstrat14 security conference in Helsinki, Finland.

Listen as Matthew discusses the importance of leadership, long-term trends in the industry, and how security is an evolving problem with no easy solution. Certain fundamentals ring true in managing cybersecurity risks. The keys to success and sustainability include communicating with and leveraging peers and experts across the industry, setting the right goals for optimal risk, implementing best processes and practices, and leading with clear purpose.

Cybersecurity is poised for a notorious year. The next 12 to 18 months will see greater, bolder, and more complex attacks emerge. This year’s installment for the top computer security predictions highlights how the threats are advancing, outpacing defenders, and the landscape is becoming more professional and organized. Although the view of our cybersecurity future is obscured, one thing is for certain: We’re in for an exciting ride.


In Evolution of Cybersecurity in 2015 Part 1, I discussed my first five predictions. In this post, I will cover the next five to complete my top 10 predictions.

Top Predictions Continued:


6. Enterprise risk perspectives change


Enterprises will overhaul how they view risks. Serious board-level discussions will be commonplace, with a focus on awareness and responsibility. More attention will be paid to the security of products and services, with the protection of privacy and customer data beginning to supersede “system availability” priorities. Enterprise leaders will adapt their perspectives to focus more attention on security as a critical aspect of sustainable business practices.


7. Security competency and attacker innovation increase


The security and attacker communities will make significant strides forward this year. Attackers will continue to maintain the initiative and succeed with many different types of attacks against large targets. Cybercrime will grow quickly in 2015, outpacing defenses and spurring smarter security practices across the community. Security industry innovation will advance as the next wave of investments emerge and begin to gain traction in protecting data centers, clouds, and the ability to identify attackers.



8. Malware increases and evolves


Malware numbers will continue to skyrocket, increase in complexity, and expand more heavily beyond traditional PC devices. Malicious software will continue to swell at a relentless pace, averaging over 50 percent year-over-year growth. The rapid proliferation and rising complexity of malware will create significant problems for the security industry. The misuse of stolen certificates will compound the problems, and the success of ransomware will only reinforce more development by criminals.



9. Attacks follow technology growth


Attackers move into new opportunities as technology broadens to include more users, devices, data, and evolving supporting infrastructures. As expansion occurs, there is a normal lag for the development and inclusion of security. This creates a window of opportunity. Where the value of data, systems, and services increases, threats surely follow. Online services, phones, the IoT, and cryptocurrency are being heavily targeted.



10. Cybersecurity attacks evolve into something ugly


Cybersecurity is constantly changing and the attacks we see today will be superseded by more serious incursions in the future. We will witness the next big step in 2015, with attacks expanding from denial-of-service and data theft activities to include more sophisticated campaigns of monitoring and manipulation. The ability to maliciously alter transactions from the inside is highly coveted by attackers.


Welcome to the next evolution of security headaches.

I predict 2015 to be an extraordinary year in cybersecurity. Attackers will seek great profit and power, while defenders will strive for stability and confidence. In the middle will be a vicious knife fight between aggressors and security professionals. Overall, the world will take security more seriously and begin to act in more strategic ways. The intentional and deliberate protection of our digital assets, reputation, and capabilities will become a regular part of life and business.


To see the first 5 predictions click here: Evolution of Cybersecurity in 2015 Part 1


If you’d like to check out my video series surrounding my predictions, you can find more here.


Twitter: @Matt_Rosenquist

IT Peer Network: My Previous Posts

LinkedIn: http://linkedin.com/in/matthewrosenquist

The 2015 Verizon Data Breach Investigations Report (DBIR) is out. First, well-deserved recognition for the great job, yet again, done by this team. This report is legendary in the security community for its relevant information, unbiased conclusions, and sheer entertainment value (I’m fairly certain the team dresses as Monty Python characters at least once a year).


This year’s installment of the DBIR continues the tradition of outlining a number of trends and significant attack methods, and for the first time really digs into providing real data around one of the most nebulous measurement problems in the industry: data breach cost impacts.


Here is my take on the top 10 things every security professional needs to know and drill down in the report:

  1. Everyone is a target, but the Public (i.e., government), Information, and Financial industries are at the center of the vortex for data breaches. Organizations in those industries should be aware and take extra precautions in protecting the confidentiality of their data.
  2. Misery loves company. 70% of incidents (where a motive was determined) have a second victim. Two-step attacks reach outward to impact others, and 75% of secondary victims are impacted within 24 hours. Being compromised puts at risk the partners, customers, and suppliers who trust you.
  3. Attacks move fast. 60% of the time, attackers compromise a target within minutes.
  4. Phishing success is up, rising from the teen percentages last year to 23% effectiveness, and it happens fast: 50% of phishing messages are opened or clicked within 1 hour.
  5. Vulnerabilities seem to live forever. 99.9% of the vulnerabilities exploited by attackers had existed for at least a year. Attackers continue to follow the path of least resistance, targeting easy victims with well-known, documented vulnerabilities. So, patch already!
  6. Exploit development is also rapid. About half of all vulnerabilities were exploited within a month of being discovered. So, apply patches early as well!
  7. Mobile device malware does not yet play a significant role in data breach attacks. Only 3 of every 10,000 phones are infected with serious malware per week, and 95% of malware types lived for less than a month, with most gone in a week.
  8. Malware is a unique beast. 70-90% of malware samples are unique to an organization, with distinctive signatures/hashes. This is likely due to how modern malware continually morphs itself once inside to remain entrenched. It shows that the old method of anti-malware pattern matching is largely ineffective, which is why, for years, security companies have been shifting to better ways of detecting malware.
  9. For the first time, the DBIR has taken the security-metrics bull by the horns and produced an impact analysis for breaches. It is one of the best models I have seen for calculating the elusive “cost per record”, and no, it is not a flat rate (not 58 cents and not $201 per record). The logarithmic model Verizon came up with varies based upon the number of records exposed. Losing 1k records would cost about $67 per record, while losing 100M records comes in at about 9 cents per record. Scalable and rational; well done!
  10. The 10th and final conclusion I make is simple: go read it. The details are numerous, the conclusions are solid, the observations are real (not survey data), and the team provides the clues necessary to interpret relevance to your own organization. This is one report you should actually take the time to read.


Twitter: @Matt_Rosenquist

IT Peer Network: My Previous Posts

LinkedIn: http://linkedin.com/in/matthewrosenquist

Cybersecurity is poised for a notorious year. The next 12 to 18 months will see greater, bolder, and more complex attacks emerge. This year’s installment for the top computer security predictions highlights how the threats are advancing, outpacing defenders, and the landscape is becoming more professional and organized. Although the view of our cybersecurity future is obscured, one thing is for certain: We’re in for an exciting ride.


In this blog I’ll discuss my first five predictions; click here to see the next 5 predictions: Evolution of Cybersecurity in 2015 Part 2


Top Predictions:


1. Cyber warfare becomes legitimate


Governments will leverage their professional cyber warfare assets as a recognized and accepted tool for governmental policy. For many years governments have been investing in cyber warfare capabilities, and these resources will begin to pay dividends.

2. Active government intervention


Governments will be more actively involved in responding to major hacking events affecting their citizens. Expect government responses and reprisals to foreign nation-state attacks, which ordinary business enterprises are not in a position to counter. This is a shift in policy, both timely and necessary to protect how the public enjoys life under the protection of a common defense.



3. Security talent in demand


The demand for security professionals is at an all-time high, but the workforce pool is largely barren of qualified candidates. The best talent has been scooped up. A lack of security workforce talent, especially in leadership roles, is a severe impediment to organizations in desperate need of building and staffing in-house teams. We will see many top-level security professionals jump between organizations, lured by better compensation packages. Academia will struggle to refill the talent supply in order to meet the demand.



4. High profile attacks continue


High-profile targets will continue to be victimized. As long as the return is high for attackers while the effort remains reasonable, they will continue to target prominent organizations. Nobody, regardless of how large, is immune. Expect high-profile companies, industries, government organizations, and people to fall victim to theft, hijacking, forgery, and impersonation.



5. Attacks get personal



We will witness an expansion of strategies in the next year, with attackers acting in ways that put individuals directly at risk. High-profile individuals will be threatened with embarrassment through the exposure of sensitive healthcare records, photos, online activities, and communication data. Everyday citizens will be targeted with malware on their devices to siphon bank information, steal crypto-currency, and hold their data for ransom. For many people this year, it will feel like they are being specifically targeted for abuse.

To see the next 5 predictions click here: Evolution of Cybersecurity in 2015 Part 2


If you’d like to check out my video series surrounding my predictions, you can find more here.


Twitter: @Matt_Rosenquist

IT Peer Network: My Previous Posts

LinkedIn: http://linkedin.com/in/matthewrosenquist

The Role of Technology in Retail: Is it enough?

My friend and colleague Jon Stine (@joncstine1) recently penned a blog regarding technology in retail. Jon has extensive retail industry and technology expertise and offers a great perspective on the role of technology in addressing the challenges retailers face. From my perspective, the challenge for retailers with store fronts is best summed up in a question: can you deliver a killer shopping experience from sofa to store aisle? Sadly, many retailers are not able to answer yes to this question. The vast majority of retailers don’t invest in innovative shopping solutions. Many are content to follow the same old formula of price discounting, coupons, and Sunday circulars. It is a well-proven formula that is just too hard to break from, but one we know no longer fits the new connected consumer.

The evolution of the connected consumer has been highlighted in the popular press at great length for at least the last three years. However, many retailers have missed the evolution of the consumer. You know, the Millennial generation: it will comprise 75% of the workforce by 2020 and command over $2.5T in purchasing power. This segment is always connected and values experiences as much as price. And by the way, since they are always connected, they never shop without their device. Yes, the evolution has been occurring for some time, and yes, the Millennials are reshaping the shopping experience. What are you going to do about it?

Retailers are facing a strategic inflection point, which could mean an opportunity to prosper or a slow ride toward demise. At least that is my point of view. Jon argues a few factors that are relevant to creating an innovative shopping experience.


  1. “Showrooming” is multidirectional (in-store and online) and it is here to stay.
  2. Leveraging big data can have a profound impact on your brand and the experience you deliver – it should be considered the starting point as you create a new shopper’s journey.
  3. Security must become a strategic imperative for the way you conduct business – trust is won in drips and lost in buckets. Cybercriminals are well funded and profitable and hacking will continue as the new normal.

As mentioned in Jon’s blog, retailers have long chosen to focus on maintaining their ongoing operations rather than investing in growth and innovation. Growth and innovation don’t come cheap. As a matter of fact, growth and innovation are more than a technology roadmap; they are a business strategy. Why do consumers flock to Amazon or any of the “new concept” stores? I argue it boils down to the experience. Amazon provides the ultimate in clienteling and sales assistance. The new concept stores I had the privilege to tour in NYC during NRF 2015 offer innovative shopping experiences.

The collection of stores we visited all offered unique and engaging shopping experiences.

Rebecca Minkoff – a connected fashion store with technology envisioned and planned by eBay, bringing the online experience into the physical store through in-store interactive shopping displays. Once shoppers select the clothes they want to try on, they tap a button to have a sales associate bring the items to a dressing room.

Under Armour Brand House – a physical space that becomes a destination for shoppers. The strategy for these stores is more about telling a story and engaging the shopper through storytelling. UA founder Kevin Plank is more interested in aligning product communication and retail presentation than anything else. His claim is that UA focuses 80% on storytelling and 20% on product, just the opposite of so many other product retailers.

Converse – yup, that old classic, the Chuck Taylor canvas shoe. Converse has been offering online customization for some time. But what if you want an immediate and unique shoe to wear to an event? Now you can visit a Converse store, select your favorite Chucks, and set about creating your own personalized style.

Similar to the way Amazon offers a unique shopping experience, these stores invested in delivering an innovation. It wasn’t a technology solution alone; it was a desire, from top to bottom, to give the shopper something unique and innovative.

Do you want help delivering growth and innovation in your retail environment? Intel isn’t going to solve all of this on its own. We work with very talented fellow travelers that offer solutions to achieve growth and innovation.


Clouding Around - A mini-blog series on the Cloud with Arif Mohamed

Part 1: 8 Ways to Secure Your Cloud Infrastructure


Cloud security remains a top concern for businesses. Fortunately, today’s data center managers have an arsenal of weapons at their disposal to secure their private cloud infrastructure.

Here are eight things you can use to secure your private cloud.


1. AES-NI Data Encryption

End-to-end encryption can be transformational for the private cloud, securing data at all levels through enterprise-class encryption. The latest Intel processors feature Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI), a set of new instructions that enhance performance by speeding up the execution of encryption algorithms.


The instructions are built into Intel® Xeon server processors as well as client platforms including mobile devices.


When encryption software utilises them, the AES-NI instructions dramatically accelerate encryption and decryption – by up to 10 times compared with software-only AES.


This speedy encryption means that it is possible to incorporate encryption across the data centre without significantly impacting infrastructure performance.


2. Security Protocols

By incorporating a range of security protocols and secure connections, you will build a more secure private cloud.


As well as encrypting data, clouds can also use cryptographic protocols to secure browser access to the customer portal, and to transfer encrypted data.


For example, Transport Layer Security (TLS) and Secure Sockets Layer (SSL) protocols are used to ensure safe communications over networks, including the Internet. Both are widely used for applications such as secure web browsing (through HTTPS), as well as email, IM and VoIP.


They are also critical for cloud computing, enabling applications to communicate over the network and throughout the cloud while preventing undetected tampering that modifies content, or eavesdropping on content as it’s transferred.
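As a concrete illustration of opting into TLS with sane defaults, here is a minimal sketch using Python's standard `ssl` module (which wraps OpenSSL); it is generic client-side code, not cloud-specific:

```python
import ssl

# A client-side TLS context with secure defaults: certificate verification
# on, hostname checking on, and legacy protocol versions refused.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject anything older

print(ctx.verify_mode == ssl.CERT_REQUIRED)   # True
print(ctx.check_hostname)                     # True
```

A context like this would then be passed to a socket or HTTP client so that every connection is encrypted and the server's certificate is validated before any data flows.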


3. OpenSSL, RSAX and Function Stitching

Intel works closely with OpenSSL, a popular open-source multiplatform security library. OpenSSL has a FIPS 140-2 validated module; FIPS 140-2 is a computer security standard administered by the National Institute of Standards and Technology’s Cryptographic Module Validation Program.


It can be used to secure web transactions through services such as Gmail, e-commerce platforms and Facebook, to safeguard connections on Intel architecture.


Two functions of OpenSSL, that Intel has contributed to, are RSAX and function stitching.


The first is a unique implementation of the popular RSA 1024-bit algorithm that delivers significantly better performance than previous OpenSSL implementations. RSAX can speed up SSL session initiation by up to 1.5 times. This provides a better user experience and increases the number of simultaneous sessions your server can handle.


As for function stitching: bulk data buffers use two algorithms for encryption and authentication, but rather than encrypting and authenticating data serially, function stitching interleaves instructions from these two algorithms. By executing them simultaneously, it improves the utilisation of execution resources and boosts performance.


Function stitching can result in up to 4.8 times performance improvement for secure web servers when combined with RSAX and Intel AES-NI.


4. Data Loss Prevention (DLP)

Data protection is rooted in the encryption and secure transfer of data. Data loss prevention (DLP) is a complementary approach focused on detecting and preventing the leakage of sensitive information, either by malicious intent or inadvertent mistake.


DLP solutions can profile content against rules and capture violations or index and analyse data to develop new rules. IT can establish policies that govern how data is used in the organisation and by whom. By doing this they can clarify security practices, identify potential fraud and avert accidental or unauthorised malicious transfer of information.


An example of this technology is McAfee Total Protection for Data Loss Prevention. This software can be used to support an organisation’s governance policies.
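The rule-profiling idea can be sketched in a few lines: scan outbound content against patterns for sensitive data and flag violations. This is an illustrative toy, not how McAfee's product works, and the rule patterns are assumptions:

```python
import re

# Hypothetical DLP rules: rule name -> regex matching sensitive content.
RULES = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
    "ssn":         re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan(text: str) -> list[str]:
    """Return the names of any rules the text violates."""
    return [name for name, pattern in RULES.items() if pattern.search(text)]

print(scan("Invoice attached, card 4111 1111 1111 1111"))  # ['credit_card']
print(scan("Quarterly numbers look good"))                 # []
```

A real DLP product layers policy on top of detection like this: who may send what, to whom, over which channels, with violations captured for review rather than just printed.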


5. Authentication

Protecting your platform begins with managing the users who access your cloud. This is a large undertaking because of the array of external and internal applications, and the continual churn of employees.

Ideally, authentication is strengthened by routing it in hardware. With Intel Identity Protection Technology (Intel IPT), Intel has built tamper-resistant, two-factor authentication directly into PCs based on third-generation Intel core vPro processors, as well as Ultrabook devices.


Intel IPT offers token generation built into the hardware, eliminating the need for a separate physical token. Third-party software applications work in tandem with the hardware, strengthening the authentication process.


Through Intel IPT technology, businesses can secure their access points by using one-time passwords or public key infrastructure.
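One-time passwords of the kind described above generally follow the standard HOTP construction (RFC 4226): an HMAC over a moving counter, truncated to a short numeric code. Here is a software sketch of that algorithm; it illustrates the OTP math only and is not Intel IPT's hardware implementation:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: secret "12345678901234567890", counter 0 -> 755224
print(hotp(b"12345678901234567890", 0))  # 755224
```

The hardware-rooted approach moves the secret and this computation into the platform itself, so the token cannot be copied off the device the way a software secret can.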


6. API-level Controls

Another way to secure your cloud infrastructure is by enforcing API-level controls. The API gateway layer is where security policy enforcement and cloud service orchestration and integration take place. An increased need to expose application services to third parties and mobile applications is driving the need for controlled, compliant application service governance.


With API-level controls, you gain a measure of protection for your departmental and edge system infrastructure, and reduce the risk of content-borne attacks on applications.


Intel Expressway Service Gateway is an example of a scalable software appliance that provides enforcement points and authenticates API requests against existing enterprise identity and access management systems.
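The enforcement-point idea boils down to checking every API request against identity and policy before it reaches a backend service. A deliberately simplified sketch follows; the keys and policy table are hypothetical, and a real gateway does far more (token validation, rate limiting, content inspection):

```python
# Hypothetical API keys mapped to the operations each caller may invoke.
POLICY = {
    "partner-key-123": {"read:catalog"},
    "mobile-key-456":  {"read:catalog", "write:orders"},
}

def authorize(api_key: str, operation: str) -> bool:
    """Gateway check: is this caller allowed to invoke this operation?"""
    return operation in POLICY.get(api_key, set())

print(authorize("partner-key-123", "read:catalog"))   # True
print(authorize("partner-key-123", "write:orders"))   # False
```

Centralizing this check at the gateway means individual backend services never see unauthenticated traffic at all.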


7. Trusted Servers and Compute Pools

Because of cloud computing’s reliance on virtualisation, it is essential to establish trust in the cloud. This can be achieved by creating trusted servers and compute pools. Intel Trusted Execution Technology (TXT) builds trust into each server, at the server level, by establishing a root of trust that helps assure system integrity within each system.


The technology checks hypervisor integrity at launch by measuring the code of the hypervisor and comparing it to a known good value. Launch can be blocked if the measurements do not match.
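The measure-and-compare step works like verifying a hash against a provisioned whitelist. A software analogy follows; TXT performs this in hardware at launch, and the "known good" value here is hypothetical:

```python
import hashlib

def measure(code: bytes) -> str:
    """Measurement = cryptographic hash of the code being launched."""
    return hashlib.sha256(code).hexdigest()

hypervisor_image = b"...hypervisor binary..."        # stand-in payload
known_good = measure(b"...hypervisor binary...")     # provisioned in advance

# At launch: proceed only if the measurement matches the known good value.
tampered = b"...hypervisor binary with rootkit..."
print(measure(hypervisor_image) == known_good)  # True  -> launch proceeds
print(measure(tampered) == known_good)          # False -> launch blocked
```

Because any change to the hypervisor code changes its hash, a mismatch reliably signals tampering before the compromised code ever runs.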


8. Secure Architecture Based on TXT

It’s possible to create a secure cloud architecture based on TXT technology, which is embedded in the hardware of Intel Xeon processor-based servers. Intel TXT works with the layers of the security stack to protect infrastructure, establish trust and verify adherence to security standards.


As mentioned, it works with the hypervisor layer, and also the cloud orchestration layer, the security policy management layer and the Security Information and Event Management (SIEM), and Governance, Risk Management and Compliance (GRC) layer.



Cloud security has come a long way. It’s now possible, through the variety of tools and technologies outlined above, to adequately secure both your data and your user. In so doing, you will establish security and trust in the cloud and gain from the agility, efficiency and cost savings that cloud computing brings.


- Arif

Security is not relevant until it fails. This is the basis for many of the recurring cycles we have seen in cybersecurity. New technology rushed to market is easily compromised by attackers; the resulting impacts drive demand for security, and bolt-on solutions begin to emerge. It is becoming evident that this trend is not sustainable given the flood of new devices and the significant growth in attackers’ capabilities. The bad guys have a growing advantage. It is time for the industry to change. Bruce Schneier reinforces the point in the CIO article “Schneier on ‘really bad’ IoT security: ‘It’s going to come crashing down’”.


The problem is not limited to the Internet of Things, but the IoT revolution promises a plethora of devices that will integrate into our lives, collecting data, providing recommendations, extending what we can control, and serving up meaningful information right when it is needed. These devices, just like the familiar computers we use every day, are subject to vulnerabilities.


Hackers are getting faster at compromising new software, operating systems, and even hardware. The trend will become more prevalent and expand beyond heavy compute platforms to include the smaller IoT devices, wearables, home automation, industrial controls, and vehicle technology that will proliferate in the coming years. The development of tools, practices, and best-known methods for vulnerability inspection of these new use cases will accelerate, allowing attacks to occur faster and deeper into the stack.


We all play a role in how this cycle will unfold. Standards bodies can institute strong security controls to establish a solid defensive baseline, or default to lower barriers of entry to encourage rapid adoption. Device manufacturers and solution providers can implement robust quality and testing as part of a secure design life cycle, or cut corners to get to market faster. Security firms can be proactive in developing solutions that anticipate attackers’ likely maneuvers, or play it safe and wait for impacts to drive customer demand. Consumers can vote with their purchases to require good security, or blindly buy without concern. We all have a role and can influence how the history of emerging technology will be written. Where do you stand?


For more of my rant, watch my Rethinking Cybersecurity Strategy video at the CTO Forum event where I challenge the top minds in technology to consider their responsibility and what is needed to change course to a more secure future.


Twitter: @Matt_Rosenquist

IT Peer Network: My Previous Posts

LinkedIn: http://linkedin.com/in/matthewrosenquist

Ready or Not, Cross-Channel Shopping Is Here to Stay


Of all the marketplace transitions that have swept through the developed world's retail industry over the last five to seven years, the most important is the behavioral shift to cross-channel shopping.


The story is told in these three data points1:


  1. More than 60 percent of U.S. shoppers (and an even higher share in the U.K.) regularly begin their shopping journey online.
  2. Online ratings and reviews have the greatest impact on shopper purchasing decisions, above friends and family, and have four to five times greater impact than store associates.
  3. Nearly 90 percent of all retail revenue is still transacted in the store.


Retail today is face-to-face with a shopper who’s squarely at the intersection of e-commerce, an ever-present smartphone, and an always-on connection to the Internet.


Few retailers are blind to the big behavioral shift. Most brands are responding with strategic omni-channel investments that seek to erase legacy channel lines between customer databases, inventories, vendor lists, and promotions.



Channel-centric organizations are being trimmed, scrubbed, or reshaped. There’s even a willingness — at least among some far-sighted brands — to deal head-on with the thorny challenge of revenue recognition.


All good. All necessary.



Redefining the Retail Space


But, as far as I can tell, only a handful of leaders are asking the deeper question: what, exactly, is the new definition of the store?


What is the definition of the store when the front door to the brand is increasingly online?


What is the definition of the store when shoppers know more than the associates, and when the answer to the question of how and why becomes — at the point of purchase — more important than what and how much?


What is the definition of the store beyond digital? Or of a mash-up of the virtual and physical?


What is the definition — not of brick-and-mortar and shelves and aisles and four-ways and displays — but of differentiating value delivery?


This is a topic we’re now exploring through whiteboard sessions and analyst and advisor discussions. We’re hard at work reviewing the crucial capabilities that will drive the 2018 cross-brand architecture.


Stay tuned. I’ll be sharing my hypotheses (and findings) as I forge ahead.



Jon Stine
Global Director, Retail Sales

Intel Corporation


This is the second installment of the Tech in Retail series.

Click here to view: blog #3

To view more posts within the series click here: Tech & Finance Series


1 National Retail Federation. “2015 National Retail Federation Data.” 06 January 2015.

Over the last several years, Intel IT has been implementing the Information Technology Infrastructure Library (ITIL) framework to transform our service delivery and enable us to align more effectively with the strategies and priorities of each of Intel’s lines of business (LOBs). In doing so, we can focus on high-priority activities that may potentially transform Intel’s entire business and boost the relevancy of IT. As the Chief of Staff for Product Development IT and the Director of Business Solutions Integration for Intel IT, I’m looking forward to meeting with others who have found the same value in using this practice or are considering starting that journey.



Intel IT at the Forefront of Business Relationship Management


From the top down, Intel IT fully understands the importance of business relationship management. In the last 18 months, we have transitioned from an organization loosely coupled to the business to one directly aligned with the business, literally sitting at the table to help make key business decisions.


To survive today, organizations must be adept at making effective use of information technology (IT) to support business operations and administration. Only a few, however, truly innovate business products, services, processes, and business models, even though today’s technology landscape offers a host of innovation enablers.

—Vaughan Merlyn, co-founder of the Business Relationship Management Institute

In 2013, Intel’s CIO, Kim Stevenson, personally asked each LOB to include an IT general manager (GM) on their staff. This suggestion was met favorably by the LOBs, who saw tremendous value in connecting more formally and more closely with IT.


Intel IT has adopted a user-centered approach to delivering IT services that enables us to optimize our IT solutions, improve employee productivity, and increase business velocity. Our user-centered approach involves proactively engaging and partnering with Intel employees and business groups to learn about their needs for information, technology, and services, as well as their desired experience. ITIL has been integral in placing the customer at the center, and our new Business Solutions Integration (BSI) service aligns with our user-centered IT strategy. It integrates business relationship management and business demand management, presenting the LOBs with a “One IT” view. Each LOB has a dedicated IT LOB GM, along with other dedicated IT staff who form that LOB’s core IT team: a business relationship manager, a principal engineer, and a finance controller.


The day I’m representing Intel’s LOB more than my day job, I’ve arrived. 

    —Intel IT Staff Member

With a single point of contact for IT, the LOBs can more easily request services. But more important, IT is attuned to the LOB’s strategies, priorities, and pain points. We’ve slashed the time it takes us to say “yes” or “no” to a business request from an average of 36 hours to 8 hours, and our level of support has improved dramatically, according to annual Partnership Excellence surveys.




Run, Grow, Transform


IT used to be thought of as the organization that kept the lights on and the business running, building tools when necessary. But here at Intel, while Intel IT does indeed keep the business running, our best value lies in proactively collaborating with our customers. Therefore, instead of focusing exclusively on “Run” activities (such as providing network connectivity), we also actively pursue “Grow” and “Transform” activities.


In the “Grow” category, for example, we conduct proofs of concept (PoCs) and enterprise early adoption tests for emerging technologies. Even more valuable are our “Transform” activities, where we are directly involved in co-creating marketable products with our product groups and providing Intel with competitive advantage.

Our BSI service incorporates these higher-value activities through its integration with the IT2Intel program. I’ll explore each of these activities in more detail in future blogs. But briefly, our IT2Intel program enables us to accelerate Intel's growth in enterprise markets by leveraging Intel IT's expertise in partnership with Intel product groups.




Shifting with the Business



Our close alignment with Intel’s lines of business (LOBs) helps us shift our priorities to meet the growing demand from the Internet of Things Group (IoTG).

As an example of how our direct involvement with Intel’s LOBs shapes our work, consider the following graphic that shows the distribution of business requests from the various LOBs. In 2013, Intel’s Internet of Things Group (IoTG), represented by the dark blue block at the top of the left-hand graph, had very few requests for IT. But in 2014, the number of IoTG business requests grew significantly. Because we have a seat at the table, we were able to evolve with the business and meet the demands of this burgeoning sector of Intel’s market.


Through our close communication with the IoTG and early PoCs, we’ve deployed infrastructure based on the Intel® IoT Platform. We are leveraging that experience to help the group deliver solutions to Intel customers. This is just one example of how, through our BSI service, IT stays relevant and valuable to the entire enterprise.

I encourage you to connect with me on the IT Peer Network and on Twitter @azmikephillips to share your thoughts and experiences relating to IT business relationship management and how it can transform the role of IT from transactional to transformational.

OEMs and other customers use Intel’s system-on-a-chip (SoC) products in their mobile devices. Intel makes a variety of SoCs, and any one SoC includes many components, with processor, memory controller, graphics, and sound integrated on a single chip. Each of these components comes with its own documentation, and there’s even more documentation that describes how to integrate these components with other custom components designed by the OEM. Pretty soon, you have tens of thousands of pages of documentation.


But each Intel customer needs only a fraction of the total available documentation -- a piece here and a piece there. They don’t want to read a 20,000-page document to find the three paragraphs they need.


Intel IT recently partnered with the Intel product group that helps Intel customers with mobile device design to improve the delivery of content to those customers.

Enter Stage Right: Topic-Based Content

Which would you rather use: a 500-page cookbook with general headings like “stove-top cooking” and “oven recipes,” or one with tabs for breakfast, lunch, and dinner, and cross-references and indexes that help you find casseroles, breads, stir-fries, and crockpot recipes, as well as recipes that use a particular ingredient such as sour cream or eggs? Clearly, the latter would be easier to use because you can quickly find the recipes (topics) that interest you.


Darwin Information Typing Architecture, known as DITA (pronounced dit-uh), is an XML-based publishing standard defined and maintained by the OASIS DITA Technical Committee. DITA can help structure, develop, manage, and publish content, making it easier to find relevant information.


Four basic concepts underlie the DITA framework:

  • Topics. A topic is the basic content unit of DITA, defined as a unit of information that can be understood in isolation and used in multiple contexts. Topics address a single subject and are short and standardized to include defined elements, such as name, title, information type, and expected results.
  • DITA maps. DITA maps identify the products a topic is associated with and the target audience. All these things help determine which topics are included in search results. DITA maps also include navigational information, such as tables of contents.
  • Output formats. DITA-based content can be delivered in various formats, such as web, email, mobile, or print. For ease of use, the content’s final design and layout—its presentation—varies to accommodate the unique characteristics of each output format.
  • Dynamic content. Customers can select and combine different topics to create their own custom documents, which is sort of like being able to replace one piece of a DNA map to create a brand new animal.
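The interplay of topics, maps, and dynamic content can be sketched in a few lines. The topic store, audience filter, and function below are invented for illustration; a real DITA toolchain operates on XML topic files and ditamaps rather than Python dictionaries, but the selection logic is the same in spirit.

```python
# Hypothetical topic store: topic IDs mapped to (title, audience, body).
# In DITA these would be individual XML topic files with metadata.
TOPICS = {
    "mem-init":  ("Memory Controller Init",  "hardware", "Program the timing registers..."),
    "audio-drv": ("Audio Driver Setup",      "software", "Load the codec firmware..."),
    "gfx-power": ("Graphics Power States",   "hardware", "Configure the power registers..."),
}

def build_binder(topic_ids, audience):
    """Assemble only the topics relevant to this customer's audience,
    the way a DITA map selects topics for a custom deliverable."""
    binder = []
    for tid in topic_ids:
        title, topic_audience, body = TOPICS[tid]
        if topic_audience == audience:
            binder.append(f"{title}\n{body}")
    return "\n\n".join(binder)

# A hardware-focused customer pulls just the two hardware topics,
# instead of paging through one monolithic document.
print(build_binder(["mem-init", "audio-drv", "gfx-power"], audience="hardware"))
```

Because each topic stands alone, the same topic can appear in many customer binders without being duplicated in the source.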

(If  DITA intrigues you, consider attending the 2015 Content Management Strategies/DITA North America conference in Chicago, April 20–22).

Intel’s Mobile Design Center Leverages DITA to Improve Our Customer’s User Experience

We designed a solution that eliminates the need for the previous long-form documentation. Instead, the solution enables SoC customers to assemble relevant content based on topics of interest. To achieve this, the Client Computing Group changed its documentation structure to topic-based content so that customers can quickly find highly specific information, enabling faster time to market for their mobile solutions and reducing the amount of time Intel engineers must spend helping customers find the information they need. The content is tagged with metadata so that customers can search on specific topics and bundle those topics into custom binders that they can reference or print as needed.


The Intel Mobile Design Center portal is described in detail in our paper, “Optimizing Mobile-Device Design with Targeted Content.” The portal’s ease of use contributed significantly to overall customer satisfaction with the solution. According to a survey we conducted, customer satisfaction scores have increased from 69 percent before implementation to 80 percent after.

Based on what the mobile communications group created in the Mobile Design Center, other groups are taking notice and creating their own design centers. For example, the Service Provider Division has committed to creating its own design center and is delivering all of its content in DITA to provide an even more interactive design for its customers.

Getting from Here to There

Converting existing FrameMaker and Word documents to DITA was not an easy undertaking. For the mobile communications group, some content wasn’t converted due to lack of time, although the group has committed to using DITA for all new content. This group performed the conversion manually, at a rate of about 5 to 10 pages per hour; the entire conversion project took months.


For the second group we worked with, who converted their entire documentation set, the conversion was accomplished using several methods. For large FrameMaker docs, they used a third-party product to partially automate the conversion process. While the resulting DITA docs still needed manual touch-up, the automated conversion was a time-saver. For smaller FrameMaker documents, topics were created manually. For Word docs, topics were manually cut and pasted.


So, was the effort worth it? Both groups agree that indeed it was. First, conversion to DITA revealed that there was a lot of duplication between documents. When in the DITA format, revisions to a topic only take place in that topic -- there is no need to search for every document that contains that topic. Not only does this reduce the time it takes to make revisions, but it also improves the quality of our documentation. In the past, without DITA, some documentation might be out-of-date because a topic was revised in one place but not in another.


“By converting to DITA we reduced the amount of content, allowing for reuse. This also reduced the amount of work for the authors,” said one team member. “DITA gives you a better feel of the makeup of your content,” said another.


Other team members touted improved revisions and version control and the ability to tag content by more than just document name.

What’s Next for DITA at Intel?

Because the solution we created is scalable, we anticipate that additional product and business groups across Intel will begin to take advantage of topic-based content to improve customer experience and Intel’s efficiency.


I’d love to hear how other enterprises are putting DITA to work for their customers, increasing customer satisfaction, encouraging dynamic content creation, and accelerating the pace of business. Feel free to share your comments and join the conversation at the IT Peer Network.

Business analytics and data insights empower today’s business leaders to make decisions faster. A recent data consolidation and analytics project lifted Intel’s revenue by $264 million in 2014, as highlighted in our recently published Annual Business Review. This $264 million represents only a portion of the $351 million in value generated by Intel IT through the use of big data, business intelligence, and analytic tools. Access to connected data in an efficient and timely manner has enabled stakeholders to analyze market trends and make faster and better business decisions.





The Right Data at the Right Time

Intel’s business processes use a significant amount of historical data to reach decisions. But isolated datasets are not very useful because they provide only a glimpse of a much larger picture. Recognizing the power of connected data, Intel IT engaged in an 18-month data cleansing and consolidation effort, connecting more than 200 GB of historical data from various disparate and vertical systems using common measures and dimensions.

The complexity of this project was daunting. There were many spreadsheets and applications, and even the same data had inconsistent identifiers in different datasets. Our efforts resulted in replacing more than 4,000 spreadsheets with a single database solution that included over 1,000 data measures and 12 dimensions, as well as tracking information for about 4 million production and engineering samples provided to customers.
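The identifier-normalization step at the heart of this consolidation can be sketched as follows. The alias table, product names, and figures below are invented for illustration; the point is only to show how mapping inconsistent identifiers onto common dimensions lets formerly disconnected spreadsheets answer a single query from one database.

```python
import sqlite3

# Hypothetical alias table: the same product carried inconsistent
# identifiers across the old spreadsheets and vertical systems.
ALIASES = {"XEON-E5-2699": "CPU-2699", "E5 2699 v3": "CPU-2699"}

rows_from_spreadsheet_a = [("XEON-E5-2699", "2014-Q1", 120)]
rows_from_spreadsheet_b = [("E5 2699 v3", "2014-Q2", 95)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (product_id TEXT, quarter TEXT, units INTEGER)")

# Normalize to the common identifier before loading, so every dataset
# shares the same dimensions and can be queried together.
for source in (rows_from_spreadsheet_a, rows_from_spreadsheet_b):
    for raw_id, quarter, units in source:
        conn.execute("INSERT INTO samples VALUES (?, ?, ?)",
                     (ALIASES[raw_id], quarter, units))

total = conn.execute(
    "SELECT SUM(units) FROM samples WHERE product_id = 'CPU-2699'").fetchone()[0]
print(total)  # 215: one answer from what used to be two disconnected spreadsheets
```

Done once, per dataset, this mapping is what makes the later automation possible: a new dataset only needs its identifiers reconciled to the shared dimensions before it joins the whole.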

Even connected data, however, is not inherently valuable, unless the data is conveyed in terms of trends and patterns that guide effective decision making. On top of our now-connected data, we added advanced analytics and data visualization capabilities that enable Intel’s decision makers to convert data into meaningful insights. About 9,000 application users that serve Intel and external customers have access to this data, along with 15,000 reporting users.


As part of the project, we automated our data management processes, so that we can now integrate new datasets in just a few hours, instead of in several months.



Boosting Sales with Reseller Market Insights

Another significant chunk of the previously mentioned $351 million -- $76 million -- was generated by a sales and marketing analytics engine that provides valuable information to Intel sales teams, helping them strategically focus their sales efforts to deliver greater revenue. The engine's recommendations identify which customers sales reps should contact and what they should talk to them about. This data significantly shortened the sales cycle and enabled sales reps to reach customers who were previously off the radar. (Watch a video about the analytics engine here.) The fact that this recommendation engine garnered Intel a 2014 CIO 100 award illustrates how important CIOs consider technology in today's business environment.
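A recommendation engine of this kind can be caricatured as a scoring function over account data. The fields and weights below are invented for illustration and bear no relation to the production engine’s model; they only show the shape of the idea: rank accounts so reps contact the highest-potential, least-recently-touched customers first.

```python
# Hypothetical reseller records; the real engine draws on far richer
# market and transaction data than these toy fields.
resellers = [
    {"name": "Alpha Tech", "recent_growth": 0.30, "months_since_contact": 9,  "open_opportunity": False},
    {"name": "Beta Micro", "recent_growth": 0.05, "months_since_contact": 1,  "open_opportunity": True},
    {"name": "Gamma Sys",  "recent_growth": 0.22, "months_since_contact": 6,  "open_opportunity": False},
]

def priority(r):
    """Favor fast-growing accounts that sales has not touched recently."""
    score = 100 * r["recent_growth"] + 2 * r["months_since_contact"]
    if r["open_opportunity"]:
        score -= 10  # already engaged, so lower urgency
    return score

# Reps work the list from the top: previously off-the-radar accounts
# surface when their growth and neglect outweigh the rest.
for r in sorted(resellers, key=priority, reverse=True):
    print(r["name"], round(priority(r), 1))
```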



What’s Next for Data Visualization at Intel

Going forward, we intend to promote collaborative analytics to Intel decision makers. For example, Intel IT has developed an Info Wall that harnesses the power of data visualization. This solution is built on Intel® architecture and is Intel’s first interactive video wall, with a viewing area measuring 5 feet high and 15 feet wide. While it’s too early to state any specific results, this unique implementation will enable new possibilities for business intelligence and data visualization. Currently, the Info Wall and its data focus on sales and marketing; we plan to soon expand its application to other areas of Intel’s business.


In an age when organizations such as Intel are rich in data, finding value in this data lies in the ability to analyze it and efficiently derive actionable business intelligence. Intel IT will continue to invest in tools that can transform data into insights to help solve high-value business problems.


Cyber attackers and researchers continually evolve, explore, and push the boundaries of finding vulnerabilities.  Hacking hardware is the next step on that journey.  It is important for computing device makers and the IoT industry to understand they are now under the microscope, and attackers are a relentless and unforgiving crowd.  Applications and operating systems have taken the brunt of attacks and scrutiny over the years, but that may change as the world embraces new devices to enable and enrich our lives.

Vulnerabilities exist everywhere in the world’s technology landscape, but they are not equal and it can take greatly varying levels of effort, timing, luck, and resources to take advantage of them.  Attackers tend to follow the path-of-least-resistance in alignment with their pursuit of nefarious goals.  As security closes the easiest paths, attackers move on to the next available option.  It is a chess game. 

In the world of vulnerabilities there is a hierarchy, from easy to difficult to exploit and from trivial to severe in overall impact.  Technically, hacking data is easiest, followed by applications, operating systems, firmware, and finally hardware.  This is sometimes referred to as the ‘stack’ because it is how systems are architecturally layered. 
The first three areas are software, which is portable and dynamic across systems but subject to great scrutiny by most security controls.  Trojans are a classic example, where data becomes modified with malicious payloads and can be easily distributed across networks.  Such manipulations are relatively exposed and easy to detect at many different points.  Applications can be maliciously written or infected to act in unintended ways, but pervasive anti-malware is designed to protect against such attacks and is constantly watchful.  Vulnerabilities in operating systems provide a means to hide from most security, open up a bounty of potential targets, and offer a much greater depth of control.  Knowing the risks, OS vendors constantly identify problems and send a regular stream of patches to shore up weaknesses, limiting the viability of continued exploitation by threats.  It is not until we get to firmware and hardware that most of the mature security controls drop away.


The firmware and hardware, residing beneath the software layers, tend to be more rigid and represent a significantly greater challenge to compromise and scale attacks.  However, success at the lower levels means bypassing most detection and remediation security controls, which live above, in the software.  Hacking hardware is very rare and intricate, but not impossible.  The level of difficulty tends to be a major deterrent, while the ample opportunities and ease that exist in the software layers are more than enough to keep hackers comfortable staying with easier exploits in pursuit of their objectives.
Some attackers are moving down the stack.  They are the vanguard, blazing a path for others to follow.  Their efforts, processes, and tools will be refined and reused by others.  There are tradeoffs to attacks at any level.  The easy vulnerabilities in data and applications yield far fewer benefits for attackers in the way of remaining undetected, persisting after actions are taken against them, and the overall level of control they can gain.  Most security products, patches, and services have been created to detect, prevent, and evict software-based attacks.  They are insufficient at dealing with hardware or firmware compromises.  Due to the difficulty and lack of obvious success, most vulnerability research doesn’t explore much in the firmware and hardware space.  This is changing.  It is only natural that attackers will maneuver where security is not pervasive.


As investments in offensive cyber capabilities from nations, organized crime syndicates, and elite hackers-for-hire continue to grow, new areas such as IoT hardware, firmware, and embedded OS vulnerabilities will be explored and exploited.

Researchers targeting hardware are breaking new ground which others will follow, eventually leading to broad research in hardware vulnerabilities across computing products which influence our daily lives.  This in turn will spur security to evolve in order to meet the new risks.  So the chess game will continue.  Hardware and firmware hacking is part of the natural evolution of cybersecurity and therefore a part of our future we must eventually deal with.


Twitter: @Matt_Rosenquist

IT Peer Network: My Previous Posts

LinkedIn: http://linkedin.com/in/matthewrosenquist



Embrace secure retail.  Embrace mobile point-of-sale:


As we are all aware, this October the standard for securing retail transactions in the US shifts to EMV. The standard will define a new era in credit card purchases by making them more secure.  With the rollout of any new standard there are bound to be opportunities for suppliers to seize the moment and deliver innovation.  The new Panasonic FZ-R1 mobile point-of-sale (mPOS) 7” tablet includes an EMV reader for securing payments. The advantages of the FZ-R1 stem from its ability to be used in-hand or in-cradle on the countertop, its rugged design, its encrypting magnetic stripe reader, and its acceptance of Near Field Communication (NFC) payments such as Google Wallet and Apple Pay.  In my opinion, that is innovation.  Any time a company can make its products easier to adopt, it gains an advantage.  With the transition to EMV, liability shifts from credit card companies to retailers.


As consumers, we desire a smooth shopping experience.  The winners will be those retailers that can offer the most seamless experience from sofa to storefront.  As a result, retailers are re-evaluating their mPOS platforms.  In most cases, technology that is thoroughly integrated, based on industry standards, and offers a clear upgrade path typically wins out.


