Trying to start off the new year with a question more than a statement, as you can see from the subject. I ask this because of some of the work I am currently doing. Over the past several months we have been looking at several "influencing" factors and their possible effect on tomorrow's corporate environment: consumerization, MIDs, netbooks, bring-your-own-computer programs, and even the growing Generation Y workforce. One area of "influence" we haven't looked at is legacy IT. It is just as much an influence as new technologies and trends. Many shops spend lots of money to put solutions, good or bad, in place; invest in infrastructure that made sense 3-5 years ago; set roadmaps that made sense when first proposed; and establish processes for how IT used to work, or should have worked. But the real question today is: what would you do differently? Should we take a more aggressive approach to end-of-lifing pre-existing technologies and solutions that cost more to support today, or that in some cases address a problem that no longer exists or has moved somewhere else? What about outsourcing: how many jobs today no longer make sense from a corporate standpoint? Providing a service is one thing, but if you are providing the same service as a vendor at a higher cost, that really doesn't make sense. I guess what I am really looking for is the value add. What would you do differently, and what value do you feel it would bring to your IT?

 

Just some food for thought to start the new year. I don't think there is a right or wrong answer, simply some space for spirited discussion.

 

Please share your thoughts!

Live migration is an essential technology for an agile, dynamic data center environment based on server virtualization. Until now, however, it has not been possible to perform live migration between servers based on different generations of processors, each with a different instruction set. This limited our ability to implement large resource pools, creating islands of servers and hindering the implementation of advanced data center capabilities.

 

Combined, Intel VT FlexMigration assist and Enhanced VMotion are designed to overcome this limitation by enabling all servers to expose the same instruction set to applications, even if they are based on different processor generations from a single CPU supplier.
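
To make the idea concrete, here is a minimal Python sketch of the underlying principle: take the CPU feature flags reported by each host in a pool and compute the common subset, which is the baseline the cluster could safely expose to guests so they can migrate to any host. This is purely illustrative, not how VMware or the processor implements the feature, and the host names and flag lists are hypothetical, shortened examples.

def common_feature_baseline(host_flags):
    """Return the CPU feature flags shared by every host in the pool.

    host_flags: dict mapping host name -> iterable of feature flag strings
    (for example, as parsed from /proc/cpuinfo on a Linux host).
    """
    baseline = None
    for flags in host_flags.values():
        baseline = set(flags) if baseline is None else baseline & set(flags)
    return baseline or set()


if __name__ == "__main__":
    # Hypothetical, shortened flag lists for two server generations.
    pool = {
        "esx-older-gen": {"vmx", "sse2", "sse3", "ssse3"},
        "esx-newer-gen": {"vmx", "sse2", "sse3", "ssse3", "sse4_1", "sse4_2"},
    }
    # Only the shared flags would be exposed to guests, so a VM started on
    # either host can live-migrate to the other.
    print("Baseline exposed to guests:", sorted(common_feature_baseline(pool)))

In a real cluster the masking is handled by the hardware assist and the hypervisor's compatibility settings rather than a script, but the intersection idea is the same: newer hosts hide their extra instructions so every host in the pool looks alike to the guest.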

 

Intel IT and Intel’s Digital Enterprise Group, End User Platform Integration, conducted proof-of-concept (PoC) testing of live migration using Intel® Virtualization Technology FlexMigration (Intel® VT FlexMigration) assist and the Enhanced VMotion feature of VMware ESX 3.5U2*. All migrations completed without problems and our testing demonstrated that we can use Intel VT FlexMigration assist for live migration of Intel IT business applications in a mixed production environment. As a result, we can create resource pools that combine servers from multiple generations, eliminating incompatible islands of servers and allowing full implementation of advanced data center capabilities. Accordingly, we expect to standardize on systems with Intel VT FlexMigration assist in the future.

 

Our recently published IT@Intel white paper, "Testing Live Migration with Intel® Virtualization Technology FlexMigration," documents the details of our tests: the types of systems tested, the workloads used, the scenarios examined, and the results.

 

The paper can be downloaded at Testing Live Migration with Intel® Virtualization Technology FlexMigration.

 

On behalf of our team, I’d like to invite you to read this white paper and comment on how you are using, or intend to use, these technologies in your data centers and on your experiences to date with these capabilities.

Momentum continues to gather for the protection of people’s private data.  On January 28th, the US, Canada, and 27 European countries will celebrate Data Privacy Day.  The security aspects seem simple in principle, but are proving to be more challenging than anyone predicted.

 

Today we celebrate Data Privacy Day, to promote fundamental principles of privacy and to raise awareness in our society.  The advancement and adoption of everyday technology has pulled this issue onto the world stage.  In recent years, consumers' insatiable desire for convenience, efficiency, and speed has placed our identities, purchases, interests, medical records, debts, communications, and social interactions into the digital world.  Indeed, our very lives are being tracked, processed, stored, and transmitted electronically.

 

There is a cost to all the inherent benefits: our privacy.  One of the most important liberties in our free and open society is the right to privacy.  Our ability to choose what others know about us grants us some semblance of control over how we can be manipulated by others.  Protecting our private data is key.

 

The realms of security and privacy are beginning to blur.  I see a trend of security organizations being asked to tackle this tricky problem.  On the surface, it appears to be straightforward.  Find the data and secure it.  However, the picture starts to get complicated when we consider regulations, security controls, data lifecycles, and the immense behavioral challenges.

 

Regulations

The European Union strongly influenced the direction back in the 1990s with the development of privacy directives that outlined some basic principles.  Since then, decentralized regulations have been germinating and taking hold, with different verbiage, requirements, and exemptions all over the world.  Even within a single country, different regulations may exist for different states, provinces, or jurisdictions.  Today's landscape is ever changing, with overlapping policies, gaps, and regulations that touch different aspects.  It is a mess.  Well, Rome was not built in a day, and neither will a unified privacy stance be.  Security, with the goal of meeting all the regulations, must understand the requirements and make them magically come to fruition.

 

Security controls

The security controls, including tools, standards, and processes, are themselves new and trying to keep up with the changing types of data and how they are handled by organizations.  It is akin to herding cats.  Finding private data is tough enough, but securing it with a comprehensive strategy without impacting the business value of how it must be used is problematic.  To compound the problem, new technologies and more types of data are being added to the pool.  Everyone loves data. Nobody loves the job of securing it.

 

Data lifecycles

It is not enough to simply lock data away from prying eyes.  Data must be managed.  In some cases, the very person whom the data represents must be given a chance to review and correct inaccurate data.  Information may be obtained only in certain ways, stored securely, accessed in a controlled manner, and, most importantly, destroyed.  Yes, destroyed.  This means security must have a strong hand in how data is managed across its entire lifecycle.

 

Behavioral Challenges


Securing data may sound tough, but the most difficult problem is not technical in nature.  It is the behavioral challenge of educating people about why security is necessary and convincing them it is in everyone's best interest.  The toughest audience to convince is end users, especially the next generation, who are just now leading the exploration of social media, cyber communication, and online communities.  They are willing to share very personal data without comprehending the risks or understanding how it may adversely affect their future.

 

 

Which brings us back to Data Privacy Day.  As an employee, I am proud that Intel is actively participating in Data Privacy Day: http://www.intel.com/policy/dataprivacy.htm.  Check out the event details, other participants, and resources!

 

 

Excerpt:

“Designed to raise awareness and generate discussion about data privacy practices and rights, Data Privacy Day activities in the United States have included privacy professionals, corporations, government officials, and representatives, academics, and students across the country.


One of the primary goals of Data Privacy Day is to promote privacy awareness and education among teens across the United States. Data Privacy Day also serves the important purpose of furthering international collaboration and cooperation around privacy issues.”

Hello again; it has been a very long time since my last external post. Sorry about that! I have plenty of excuses as to why, just none worthy of expressing. I was sitting down the other day, reviewing some of my industry RSS feeds, reading a few tweets from those I follow, and reflecting on my team's work in the collaboration space for our internal Intel employees.

 

Industry experts, analysts, and some that say they are experts point out their "right answers" to collaboration (blogs, wikis, and social networks, to name a few). It makes me stop and wonder: are those folks looking back at the history of collaborative tools, or are they focusing their energy on "the shiny new thing"? Let's look back for a moment: do we think collaborative tools are something new? (They really are not.) Look at past improvement attempts, like email. Here at Intel, email is still the big collaborative tool. Would we say that was a success? If so, why improve it? It has definitely filled a gap for quite some time. Many folks still use it a lot (just go on vacation for a week without checking your email to see how much). Some folks have moved to Intelpedia (our internal wiki) for posting content; Intel's wiki use has taken off over the past 2 to 3 years. Maybe five years from now we will look at wikis in the same vein as email. What is next up? How will we feel about that one in ten years? We are being challenged to deliver new collaborative capabilities, which to me are solving the same set of problems that have been around for quite a while (with a few new issues added).

 

While it's important to avoid locking ourselves in the past, or letting the past bias our view of current or emerging tools, it is extremely important not to forget the history of collaborative tools and the complex problems those tools attempted to address. The Web 2.0 vendors need to look long and hard at those problems and use cases, rather than shining up something new that meets only some of the needs. I come across challenges every day when speaking with Intel users and teams. When I try to get a better understanding of the problems they are describing, they point to a solution they have seen: some shined-up version of something that could work, maybe.

 

Shiny objects always get someone's attention. We ran into a recent challenge around micro-blogging at Intel. Many Intel folks are on Twitter (sbell09 for me), and this is great for external stuff. The question comes down to, "Am I sharing something externally that I should not?" That question started internal use of Yammer, for the Intel group only, which grew to over 400 users. Many folks saw some value, others did not, but it all comes down to what you put in. A variation of the Twitter question was asked: "Is Intel IP secure?" Yammer is externally hosted. Someone asked, why don't we just set it up internally, within the firewall? That very weekend someone did just that.

 

We must not forget that these new technologies are not perfect. We must also not forget that the individual behavior changes that come with these tools will be a big shift. That change must come with improvements in getting work done, quickly and securely.

 

What challenges do you face? Do folks remember history? Do they care? How do we stay ahead?

Don't assume people will read the security policy!

 

Just because the policy is posted does not mean everyone will read it.

 

 

Listen to the audiocast: Information Security Policy Must Be Marketed to Employees

 

Policy, like any other communication, must be marketed.  It is the role of the security professional to show end users the value and how it helps them.  Make it personal.

 

References: SANS.org blog: How to Suck at Information Security

Cloud computing is getting a lot of press lately as a way to quickly add new computing capability and reduce costs.  Many enterprises, including Intel, are determining how best to extend to the cloud.

 

By "cloud" we are referring to a complex computer network, most often the internet.  Cloud computing consists of Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS).

 

The Intel strategy is to grow the cloud from the inside out.  We are taking advantage of SaaS and IaaS where possible and we are building a private, internal cloud (iCloud) computing environment.  Find out more by reading the following white paper:

 

Developing an Enterprise Cloud Computing Strategy

Can security be detrimental to an organization?  Absolutely!

 

Being aware that security programs may themselves become a source of losses and introduce more risk is important for establishing and maintaining a valuable security capability.

 

Listen to the audiocast

 

It is important to understand that there is a dark side to information security.  If it is not professionally managed, it can reduce productivity, cause financial losses, and introduce liability for the corporation.

After spending the last six months researching emerging technologies around the IT client platform, I have identified two must-have technologies to consider for your next client refresh.

The first is solid-state drives (SSDs).  While the cost is a concern at first glance, the benefits you receive from this technology are incredible.  We have seen benefits such as no more hard drive failures due to moving parts, increased performance from faster startup and resume times, and better application responsiveness from quicker access on an SSD versus a traditional platter-based drive.  Fragmented hard drives become an issue of the past, so you can save the cost of third-party defrag tools or custom solutions developed in house.  These are just some of the many benefits we have seen; for a more in-depth review, check out our recently released white paper: http://communities.intel.com/docs/DOC-2524.  But beyond all of these benefits are the ones you may need in the future.  As IT moves toward a more virtualized client environment, technologies like these make adoption much easier.  When testing SSDs, we noticed that our virtualized IT environment running in a traditional Type-2 client hypervisor actually ran 27% faster than the same virtual environment on a traditional platter-based drive.

This brings me to the next technology: VT-d, the next evolution in client support for virtualization.  While VT-x is common on today's systems, VT-d is now available on many newer systems.  VT-d offers what we refer to as a "direct pass-through" interface for virtual machines to communicate with the system hardware.  What this means for you is that a virtualized OS can talk directly to certain parts of your system's hardware without having to go through a virtualization layer in a host OS.  This will also enable better use of Type-1, or "native client," hypervisors that allow two operating systems to run side by side, at the same time, on a single platform.  Imagine being able to support a corporate build and a personal build on the same machine while keeping them isolated from each other.  This opens the door to a host of possibilities for future IT shops.

Not all of these technologies are ready to run at full speed today, but with most shops carrying a 2-3 year refresh cycle, it is important to buy the right technologies at the right time, so that when you want to deploy these capabilities, you have systems that support them.  So make sure you check out these two technologies and get them into your client roadmap as soon as possible.
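
If you want a quick way to see whether a given client even advertises these capabilities before planning a pilot, the short Python sketch below checks two standard Linux interfaces: the "vmx" CPU flag in /proc/cpuinfo for VT-x, and the presence of IOMMU groups under /sys/kernel/iommu_groups for an active VT-d/IOMMU.  This is only a rough, Linux-only check under those assumptions; it does not tell you whether the features are enabled in BIOS, and your management tooling may report this differently.

import os


def has_vt_x():
    """True if the CPU advertises Intel VT-x (the 'vmx' flag) on Linux."""
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return "vmx" in line.split()
    except OSError:
        pass
    return False


def vt_d_active():
    """True if the kernel reports an active IOMMU (VT-d) with device groups."""
    path = "/sys/kernel/iommu_groups"
    return os.path.isdir(path) and len(os.listdir(path)) > 0


if __name__ == "__main__":
    print("VT-x advertised by CPU:", has_vt_x())
    print("VT-d / IOMMU active   :", vt_d_active())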

Before I begin, I just wanted to share that this is my first attempt at blogging, and I'm really excited to try out this new medium (at least for me).

 

My name is Gal Eylon. I'm a program manager within Intel IT, and I lead a team responsible for vPro adoption activities across our enterprise. Recently we posted a white paper, Implementing Intel® vPro™ Technology to Drive Down Client Management Costs, that details the journey we went through to fully deploy vPro use cases within our production environment. The white paper walks you through our architecture and engineering phases and then takes a deep dive into the operational phase, which made use case deployment a reality for Intel.

 

Although our journey was not easy (and has only begun…), we are pretty pleased with our results. We hope you will benefit from this white paper and that it will ease your adoption activities within your environment. In addition, I would appreciate it if you would share some of the experiences, BKMs, and challenges you are facing within your enterprise. If you are looking for additional information about our adoption activities, please let me know and I’ll be more than happy to share.

 

Happy New Year!
Gal.

OS Streaming can deliver considerable manageability benefits to Intel IT training rooms with multi-user PCs.  To evaluate performance and utilization in a production environment, we conducted proof of concept (PoC) testing in two rooms located in different buildings on an Intel campus.  We found that OS streaming improved manageability and delivered fast client boot times with moderate server and network utilization, even during worst-case boot storms.

 

See full paper posted at:

 

http://download.intel.com/it/pdf/Improving_Managemability_with_OS_Streaming.pdf
