
Get social with Open Port

Posted by marmstrong Jun 30, 2009

As social networking tools become more widely used, we've decided to expand the Open Port presence, both to keep up with what's happening and where people are hanging out, and to connect the members of the Open Port community with each other in new and engaging ways.


That's why we've started a number of social media profiles to connect with interesting people and distribute Open Port content where they are, hopefully prompting them to come back here to Open Port and further engage in the conversation.


To that end, you can follow us on Twitter, subscribe to our profile on FriendFeed, see what we find interesting on Delicious, and become a fan of Open Port on Facebook.


We hope these profiles will be useful to all of you. Those sites and networks have a ton to offer in the way of two-way conversation, which is why we're using them.


If you're an Open Port community member and are on those networks, we encourage you to share your profile in the discussion forums, so that we can connect with you and other members of the community can find you there as well.




Mike (Open Port Community Admin)

Let me begin by way of introduction: I am a strategic financial analyst with the Intel IT Finance organization, focused on data center strategy and efficiency efforts.  This is my maiden voyage into the world of blogging, so I hope the topic is relevant and interesting to the audience.



Similar to many organizations, Intel IT is focused on constantly improving the cost of keeping the business running while not sacrificing the level of support required by customers.  With industry and technology solutions evolving at an increasing pace, choosing the most appropriate place and time to invest is paramount to driving down infrastructure costs.  Budget constraints in this economic climate make implementing efficiency efforts all the more daunting.



In 2008, Intel IT initiated a Design Server Refresh strategy whose basic premise was to leverage server performance improvements to respond to increasing compute requirements without growing data center capacity at a corresponding rate.  That year, we were able to remove 20,000 single-core servers from our production environment, allowing us to realize approximately $45M in savings by avoiding data center additions and server operating costs.  However, even with this strategy driving significant near-term results, the 2009 operating environment forced us to pause and re-evaluate the merits of continuing to execute the strategy.



The re-evaluation concluded that this was an investment that couldn't be deferred, due to the need for incremental growth and the high utilization of our existing data centers.  In addition, based on an average 10:1 consolidation, the refresh of single-core servers would generate significant operating savings and clear more headroom than seen historically.  The details of this analysis are included in the white paper: Staying Committed to Server Refresh Reduces Cost.
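To make the arithmetic concrete, here is a minimal sketch in Python of the kind of back-of-the-envelope consolidation math behind a refresh decision. The 20,000-server count and the 10:1 ratio come from this post; the per-server annual operating cost is a hypothetical placeholder, not a figure from the white paper.

```python
def refresh_summary(old_servers, consolidation_ratio, annual_cost_per_server):
    """Estimate the footprint and operating-cost impact of a server refresh."""
    new_servers = old_servers // consolidation_ratio       # replacements needed
    removed = old_servers - new_servers                    # net footprint reduction
    operating_savings = removed * annual_cost_per_server   # yearly cost avoided
    return new_servers, removed, operating_savings

# 20,000 single-core servers at a 10:1 consolidation ratio; the $1,500/yr
# operating cost per server is an assumed, illustrative value.
new, removed, savings = refresh_summary(20_000, 10, 1_500)
print(new, removed, savings)  # 2000 18000 27000000
```

Even with a modest assumed operating cost, the net reduction of 18,000 machines dominates the result, which is why the headroom argument carries so much weight.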



Questions for the readers: Do others have a refresh strategy or guideline?  Are you seeing this type of impact, and what challenges have you encountered in implementation?



I wanted to take a minute to talk about solid state drives and the impact they have had on my mobile computing experience, but we put together a video that does a better job than I can in words alone.  Enjoy!








If there was one thing among many that annoyed Windows Vista users, it was User Account Control. Constant warning messages asking for permission to continue many tasks were no joy to any user trying to do even basic tasks on their PC. The tweaking channels were soon inundated with requests for tips on how to stop it nagging you every time you wanted to do something. Fortunately, UAC has been improved quite a bit in Windows 7, so it isn’t quite as annoying as it was in Vista. You can, as ever, tweak it more if you like.


To get started, navigate to the Control Panel, then User Accounts and Family Safety. Click User Accounts, then Change User Account Control settings. On the next screen, move the slider to select the level of protection you want.


Ensure you have a backup or restore point on your machine before making any changes. If you follow these instructions to the letter you should have no problems, but we can’t be held responsible if things go wrong.


Here are the four levels, and what they mean:


Always notify on every system change. This works like Vista: a nannying prompt pops up whenever you make changes to your system.

Notify me only when programs try to make changes to my computer. This is the default setting. Make a change while logged in as an Administrator and it stays quiet; when a program makes a change, a prompt appears to check what’s going on.


Notify me only when programs try to make changes to my computer, without using the Secure Desktop. This setting is identical to the default, with one difference: it won’t dim your desktop so that only the UAC prompt asking you to take action is in focus. This presents a slightly higher security risk than the default, as a program could allow another malicious program or code to interfere with the UAC prompt.


Never notify. UAC is turned off. This is an insecure option and not recommended for most users. However, if you have a good firewall and anti-virus, you can turn it off if you like.


After you choose your level, click OK.


You can also disable UAC with a registry hack if you have the skills.


Open the registry editor (regedit) and navigate to:


HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System


Find the entry EnableLUA and modify its value to 0 (zero). Then find the ConsentPromptBehaviorAdmin value and change that to 0 (zero) too.

The next time you restart your machine, UAC will be turned off and you will never be bugged by it again.


As always with Windows, there is a downside: you can no longer use Windows Gadgets, as Microsoft considers the system too open to attack with UAC off and gadgets running in the background. I personally never used the gadgets anyway…


[post edited by Intel Admin due to a violation of terms of service]

I was recently trading thoughts with Anton Chuvakin, a respected security metrics professional, in a philosophical discussion of the perfection and quality of security.  Admittedly, I was on auto-pilot (operating without the benefit of coffee), rattling away with my ‘Optimal Security’ rhetoric, when Anton posed two thought-provoking questions: Can one "mandate optimal security"?  How do you "mandate flexible"?


I was stopped in my tracks.  This got me thinking.  After fetching a tall cup of coffee to start my brain juices flowing in earnest, I reached back into the pages of history to come up with the following perspective and examples:


I believe, to a certain extent, we can mandate flexibility and optimization.  Surely we can act in ways which deny both.  So why can’t we act in a manner which intrinsically promotes them?


I think back to lessons of WWII and the Maginot line.  The French chose to create a fortification which was static by design and lacked mobility or a capability to adapt to changing enemy tactics.  They invested heavily into this control, which became the backbone of their country's eastern defense.  It was an appalling failure.  Alternatively, the German blitzkrieg, and the stratagems of both Rommel and Patton prevailed.  Flexibility through mobility was far more effective than an elaborate static defense.


I would argue that flexibility can be mandated through proper planning and design.  We have examples in the history of information security.  In the early years of Anti-Virus (AV) products, they were non-memory-resident applications prescribed to be run once a week.  Updates were a rarity, if they came at all.  That rigid design quickly lost effectiveness as the velocity of new malware rose.  AV vendors were forced to adapt.  The overall design has changed to one which is flexible, can be updated to meet emerging malware, and continuously runs in the background to provide persistent security.


Rigid security postures lack the ability to remain effective over time and are likely derived from an equally rigid infrastructure which will struggle to adapt to new threats and changes within the organization.  Create security to be flexible and you enable the service to keep up with continual change.


In general, design a system to be flexible and you extend how long it remains effective.  Plan how systems can continuously adjust themselves toward what is 'optimal' and you increase their sustained efficiency.


We must be strategic in our planning and design of security, lest we suffer the fate of France's Maginot line.


Check out Anton’s Blog for other thought provoking viewpoints; just be sure to have your coffee at the ready.

More on “Optimal security”:

Strategy for Sustaining Optimal Security

Information Security Defense In Depth Whitepaper is Now Available

Fortune Cookie Security Advice - June 2008

Defense In Depth Strategy Optimizes Security

The Four Dirty Questions of Measuring Information Security

What are your thoughts?  Rigid or Fluid?  Have you implemented optimal and flexible?

Think strategic.  Act competitive.  Be secure.


Everyone wants information security to be easy.  Wouldn’t it be nice if it were simple enough to fit snugly inside a fortune cookie?  Well, although I don’t try to promote such foolish nonsense, I do on occasion pass on readily digestible nuggets to reinforce security principles and get people thinking about how security applies to their environment.


The key to fortune cookie advice is ‘common sense’ in the context of security.  It must be simple, succinct, and make sense to everyone, while conveying important security aspects.


Fortune Cookie advice for June, 2009:




Think strategic.  Act competitive.  Be secure.


Security is a sustaining commitment where long term planning provides a distinct advantage.  Threats are derived from intelligent adversaries.  Success requires maneuvering in a competitive manner to remain secure.





Fortune Cookie Security Advice - May 2008

Fortune Cookie Security Advice - June 2008

Fortune Cookie Security Advice - August 2008

Fortune Cookie Security Advice - September 2008

Fortune Cookie Security Advice - November 2008

Fortune Cookie Security Advice - December 2008

Fortune Cookie Security Advice - January 2009

Fortune Cookie Security Advice - February 2009

Fortune Cookie Security Advice - March 2009

Fortune Cookie Security Advice - April 2009

Fortune Cookie Security Advice - May 2009

Optimal security must not only be attained, but also sustained over time.  A good security strategy must be forward thinking to understand how intervention and continual maintenance will be needed, then implement those capabilities as part of a complete service deployment.



'Optimal Security' is the right balance of security spending and losses prevented, where business-acceptable losses are achieved.  It changes often and likely maintains different targets for the dissimilar parts of the entity.


Organizations are likely to mandate security expectations, which typically manifest in a set of configurations, specifications, and operating standards.  The risk is that these security controls may be relatively static and entrenched.


Establishing a security baseline is good practice, but in order to remain effective it must adapt to changes in the environment, staying in lock-step with rapidly changing threats, vulnerabilities, and resulting exposures.  It must be a fluid posture, able to change rapidly based upon different internal priorities and external changes.  The sustaining business structure must be designed to continually predict areas needing modification and to support the design and deployment of those changes.  Rigid security postures lack the ability to remain effective over time and are likely derived from an equally rigid infrastructure which will struggle to adapt to new threats and changes within the organization.  Design security to be flexible and you enable the service to keep up with continual change in the information security landscape.


I recently spoke with an organization that had established a security posture relying heavily on a hardened OS and application build for their systems.  At the time, they deployed a platform which took into consideration all the best configurations for hardening.  They were so confident they had satisfied security requirements that they considered the problem solved.  They integrated the security design into their normal platform refresh cycle of system replacement every few years.  They never anticipated that they would need to continually update the build to compensate for changes in threats, new vulnerabilities and malware, and evolving business usage models.


The platform’s security, which initially was strong, began to erode quickly.  With no internal mechanism to identify when changes needed to be made, nor the testing and distribution capability to make them, they soon found themselves responding to individual incidents and changing systems one at a time based upon particular end-user needs.  This created inconsistencies in the builds, which were more difficult to support.  Without proper forethought, the security team turned itself into a firefighting organization, losing the initiative in the war of security.


This is one simple technical example, but the same holds true for the expanse of automated solutions and behavioral security controls as well.  Highly effective and efficient security strategies are forward-thinking: they understand how intervention and continual maintenance will be needed, then implement those capabilities as part of a complete service deployment.  Overall, the concept of ‘optimal security’ is one of fluid adaptation of controls to meet an ever-changing target for risk acceptance.


Back in April I told you about a small proof of concept we were planning to measure energy use in the office environment and then use that established baseline to test different energy saving methods.  I thought it would be good to give you a quick status update on the work done to date.


The PoC is currently underway and, in fact, is nearing completion.  As I mentioned in April, it is pretty small, with just 12 users, but we hope the results will help direct what we might later try on a larger scale.


We started the PoC on Friday, May 15th, with meter loggers installed on 6 circuits, monitoring energy use for the 12 PoC users every 3 minutes.  We ran the metering for 2 weeks before telling the PoC users, in order to establish an uninfluenced baseline.


After setting the baseline, we split the 12 users into 3 groups, each focusing on a different energy-saving technique.


One group receives information on their energy use every 2 days, showing how much energy they are using, what it is costing, and a few simple tips on how they might reduce their use.  Nothing is forced.  In this group, we are looking at how “Awareness” alone might change behavior.
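As a sketch of what such an awareness report might compute, the snippet below turns a series of interval power readings (one per 3-minute metering interval, as in the PoC) into kWh used and an estimated cost. The sample readings and the electricity rate are invented for illustration; they are not PoC data.

```python
def energy_report(watt_readings, interval_minutes=3, rate_per_kwh=0.10):
    """Convert interval power readings (watts) into energy used (kWh) and cost.

    Each reading is assumed to hold for one full metering interval.
    """
    hours_per_interval = interval_minutes / 60
    kwh = sum(w * hours_per_interval for w in watt_readings) / 1000
    return kwh, kwh * rate_per_kwh

# Hypothetical full day of readings: 480 three-minute intervals (24 hours)
# at a flat 120 W draw; $0.10/kWh is an assumed rate.
kwh, cost = energy_report([120] * 480)
print(f"{kwh:.2f} kWh, ${cost:.2f}")  # 2.88 kWh, $0.29
```

The value of the report is less the raw number than the trend: sending it every couple of days lets each user see whether their own changes move the line.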


The second group installed a third-party agent on their systems which allows us to enforce more restrictive energy management profiles than they might normally use.  The software also records time in state on each system, thereby providing a degree of “soft” individual system metering.
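The “soft metering” idea can be sketched as follows: multiply the hours recorded in each power state by an assumed average draw for that state. The state names and wattages below are hypothetical illustrations, not figures from the agent used in the PoC.

```python
# Assumed average draw per power state, in watts (illustrative values only).
STATE_WATTS = {"active": 90, "idle": 45, "sleep": 2}

def soft_metered_kwh(hours_in_state):
    """Estimate a system's energy use (kWh) from recorded time-in-state."""
    return sum(hours * STATE_WATTS[state] / 1000
               for state, hours in hours_in_state.items())

# One hypothetical workday: 6 h active, 2 h idle, 16 h asleep.
kwh = soft_metered_kwh({"active": 6, "idle": 2, "sleep": 16})
print(round(kwh, 3))  # 0.662
```

Unlike circuit-level metering, this attributes energy to each individual machine, which is exactly the per-user granularity the circuit loggers can't provide.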


The last group had USB-triggered power strips installed in their offices, connected to their docking stations, which automatically power off devices that do not need to be on when the users are not there.  We connected devices such as task lighting, displays, and chargers to these strips.


We are in the last week of data collection now, so stay tuned for some high-level results to be posted soon, and possibly a full paper published later.


Please let me know if you have any questions or if you are doing or have done anything similar in your enterprise.



As a major global manufacturer, Intel works constantly to improve its supply chain. Our ERP implementation and key projects are integral ingredients in driving supply chain improvements. It was exciting to recently see Intel recognized as one of the top leading companies from a supply chain perspective when AMR Research published The AMR Research Supply Chain Top 25 for 2009.


Check out AMR’s Press release:

Intel uses the concept of corporate goals as a way to crystallize what is important across the company.  Every year, the CEO and his staff agree on the big items Intel wants to achieve.  These are defined, and grading is agreed on.  This is a great recognition tool in that it focuses all needed areas of the company on achieving these goals.


From an ERP perspective, corporate goals have several advantages.  When an ERP effort is one of the corporate goals, it tends to be a lot easier to get support from matrix groups, since all groups want to achieve and support the corporate goal.  Generally, groups tend to focus on their own goals (since not all groups have a corporate goal for their activities), but the corporate goals break down cross-group barriers and trump group goals.


In years past, ERP overall, or individual ERP programs, were not part of the corporate goals.  When this was the case, ERP efforts could be categorized as being IT or business focused.  Items such as ERP upgrades, hardware upgrades, etc. tend to be IT focused.  On the other hand, business efforts tend to focus on delivering new functionality (e.g. implementing a new Advanced Planning module) that will enable some new element in the business (e.g. a new division or warehouse, or improved delivery performance).


When an IT ERP program supports a business corporate goal, that tends to be a powerful catalyst in terms of ensuring executive and senior management support, resources, and support from other groups.  But the ideal ERP program has both an IT corporate goal and a business corporate goal.  When these rare conditions exist, obstacles are removed as if by magic: the business is extremely motivated, as are all the groups needed in IT.  The downside is that the amount of visibility and scrutiny tends to be extremely high.  But all in all, the positives outweigh the negatives in this “ideal” ERP scenario.


Whether a corporate goal or not, I would argue that an essential ingredient in ensuring an ERP effort is successful is that both the business and IT think it is a priority.  This may seem obvious, but it is not uncommon for an IT department to pursue a major effort that is not aligned with business priorities.  When this happens, the risk of failure increases dramatically.  At Intel, IT can get a major program included as a corporate goal, and this in turn ensures senior business management support.  Although very powerful by itself, the effort becomes even more powerful when the same ERP effort is also a business corporate goal.  We have examples of this alignment, and it creates a positive environment for ensuring visibility and results.
