My name is Soren Andersen.  I work in Intel’s Information Technology Supply Network Capability group as a manager delivering strategic Enterprise Resource Planning (ERP) programs and projects.  My goal is to blog about the challenges of delivering ERP in a large corporation such as Intel, with a focus on delivering ERP to enable the business.  Since I am from Intel, you might expect me to blog about Intel products; however, I will leave that to other company experts.  In this entry I will give you some background on who I am and provide some context and a framework for upcoming entries.

 

I have worked in the Information Technology/Information Systems field for over 20 years.  My degree is in Industrial Engineering.  I spent my first six-plus years at Electronic Data Systems (EDS, now part of Hewlett-Packard) doing systems integration on engineering and manufacturing systems in the Midwest.  I spent the latter part of my time there as a manager delivering imaging systems for the government, manufacturing, and medical industries.  The environment moved from mainframe to Unix and then to PCs and client/server systems.  Next, I moved to a start-up consulting firm, where I worked on manufacturing/supply chain systems, primarily for high-tech manufacturing clients in the Pacific Northwest.  There the focus was on both custom client/server applications and packaged software for advanced supply chain planning.

 

In my 11-plus years at Intel I have had the opportunity to work in a variety of IT/IS roles, most of them focused on delivering Enterprise Resource Planning (ERP) systems.  I have worked on the strategic front end, defining roadmaps and budgets to implement ERP.  I have delivered multiple large ERP components to various divisions within Intel, and I have had the opportunity to run a consolidated ERP support organization of up to 175 people worldwide.  I have also worked on efforts such as B2B (RosettaNet, Web, etc.) and Reporting/Business Intelligence/Analytics, which tend to sit at the periphery of ERP efforts but, at the end of the day, are also critical to their success.  In all of these roles the common denominator has been that I have always been a people manager, while at times also carrying the program manager title.  I mention this because it is my vantage point.  There are those who are strictly people/resource managers, whose focus is on developing people, and there are those who are individual-contributor program managers, who focus strictly on the programs and leverage matrixed resources.  I have managed teams where all of the resources, down to the analysts and the programmers, reported to me, and, as I do now, I have managed product managers, program managers, project managers, technical leads, and architects without the bulk of the resources.  What permeates my perspective is the fact that I am a manager responsible for a team that delivers key ERP programs.

 

In terms of what you can expect from these blogs, here are some of the topics that I work with, come across, and find interesting:  Program/Project Management, Resource Management, Program Management Office (PMO), Roadmaps, yearly Budgeting, Steering Committees/Management Review Committees, Methodologies for ERP (e.g., Agile, Waterfall), Processes (e.g., Program Lifecycle, CMMI), value of programs, metrics, teams (geographically dispersed, large vs. small, in-house/outsourced/contract), Supply Chain, and whatever else I may be working with in the course of delivering ERP solutions at Intel.  I welcome your input on additional topics as well.  I am looking forward not just to sharing my own thoughts but also to learning from fellow travelers in the field of delivering ERP.

In the spirit of continuous improvement and site optimization, Intel has decided to begin consolidating communities here within Open Port.  The IT Playground was launched about a year and a half ago as a fun zone dedicated to games, funny videos, and other entertaining content.

 

There were some really cool rock videos posted for the Hard Rock Soft Rock campaign featuring Christopher Guest of Spinal Tap.  I think this was the only time I ever saw people rocking out to value propositions like "hardware based remote manageability" and "even while the OS sleeps you can make your updates".  Those songs were really catchy and I found that I couldn't get them out of my head for a couple of weeks.

 

There were promotions for online games like Robobrawl and IT Manager 2.  There were funny PEBKAC (problem exists between keyboard and chair) videos showing typical problems that IT managers face on a daily basis and how they could be resolved with Intel technology.

 

But at the end of the day, there just wasn't any value being created for you, our users.  So what have we done?  We still have all that content, but it has been merged into the vPro Expert Center.  The message we got was that the community just wasn't being cared for the way it needed to be, so we consolidated in order to keep the focus on what matters.

 

First, Happy Earth Day Everyone!

 

A couple of weeks ago, I told you about a small proof of concept we are conducting to measure energy use in the office environment and then use that baseline to test different energy-saving methods. The PoC is currently in the planning stage, but we hope to start physical metering within the next few weeks. Stay tuned for more info.

 

Today, I’d like to quickly tell you about a little effort to increase awareness of energy use in the office. Often, we have little to no understanding of how much energy we are using or how much it is costing us. Awareness can positively influence behavior and reduce energy use. Several studies of energy use in the home have shown that awareness of real-time, whole-house energy use resulted in voluntary reductions of 10%-15%. To help increase awareness internally, IT measured the energy use of several items found in a typical office, such as desktops, laptops, LCD displays, and phone and headset chargers. This information will soon be published internally via a simple web page showing a photo of a typical office space. As the viewer moves their mouse over each device in the office, a pop-up will show how much energy that device uses in various states. Below the photo is a general summary of how much energy and money could be saved if various generalized behaviors were changed. Lastly, there is a link to more details. It’s very simple, but it should greatly help increase awareness of energy use in the office.
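To give a rough feel for the arithmetic behind that savings summary, here is a minimal sketch in Python. The wattages, hours, and electricity rate are illustrative assumptions of my own, not the figures IT actually measured.

```python
# Rough, illustrative estimate of what leaving office devices on overnight costs.
# All wattages, hours, and the electricity rate are assumptions for illustration,
# not measured Intel data.

IDLE_WATTS = {             # assumed idle power draw per device, in watts
    "desktop": 80,
    "laptop": 20,
    "lcd_display": 30,
}
RATE_PER_KWH = 0.10        # assumed electricity cost, dollars per kWh
IDLE_HOURS_PER_NIGHT = 14  # assumed hours left on outside working hours
WORK_DAYS_PER_YEAR = 250

def annual_overnight_cost(watts: float) -> float:
    """Energy used overnight for a year, converted to dollars."""
    kwh = watts / 1000 * IDLE_HOURS_PER_NIGHT * WORK_DAYS_PER_YEAR
    return kwh * RATE_PER_KWH

for device, watts in IDLE_WATTS.items():
    print(f"{device}: ~${annual_overnight_cost(watts):.2f} per year if left on overnight")
```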

 

How about you? Are you doing anything to help increase awareness of energy use in your office spaces? Would something like this work for you?

 

-Mike

Is the value of patch management decreasing?  Some experts say that, due to a rise in privately held vulnerabilities, the value of patch management is eroding.  Others feel patching is losing the race and becoming too little, too late given how rapidly attackers develop.  I too have chimed in on the topic and stated that patching all vulnerabilities is not economical, as most are never widely exploited.  But does this mean we should be looking at alternate paths, away from patch management?  I stand firm in support of the end-node update concept, but take a slightly different view of its scope and value.

 

I see ‘patch management’ as the strategic capability of managing end nodes.  I use ‘patches’ as a broad term that includes OS, application, and hardware/BIOS upgrades that can improve the security posture of the device.  This includes, and is akin to, the widely accepted delivery of security product updates for anti-virus, anti-spyware, firewalls, and the like, some of which are updated daily.

 

Attacks are constantly changing.  They normally take advantage of poor coding practices, use designed functionality in unintended ways, or prey on misguided end-user judgment.  The ability to update systems is crucial to maintaining security equilibrium; it is the support function that lets systems adapt to new threats.  This capability has a multitude of benefits, both strategic and tactical.  Being able to reach out to systems allows for a better understanding of the number, type, and usage of systems in the environment.  An effective system can paint a picture of which systems are at risk.  It is a sweeping means of closing identified vulnerabilities in deployed code, which can reduce the exposure surface.  It can be used to respond to compromises and drive clean-up activities.  Such services can raise the general security level of a community and may drive toward a more homogeneous security stance, which lends itself strongly to efficiency.
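To make the "picture of systems at risk" idea a bit more concrete, here is a minimal sketch that joins an asset inventory with required-patch data and reports the remaining exposure. The hosts, patch IDs, and field names are hypothetical and are not taken from any specific Intel tool.

```python
# Toy sketch: combine an asset inventory with required-patch data to highlight
# systems still exposed to known, patchable vulnerabilities.
# All hosts, patch IDs, and field names below are hypothetical.

inventory = [
    {"host": "pc-001",  "os": "WinXP", "installed": {"KB1", "KB2"}},
    {"host": "pc-002",  "os": "WinXP", "installed": {"KB1"}},
    {"host": "srv-001", "os": "Linux", "installed": {"P-10"}},
]

required = {
    "WinXP": {"KB1", "KB2", "KB3"},
    "Linux": {"P-10", "P-11"},
}

at_risk = []
for system in inventory:
    missing = required[system["os"]] - system["installed"]
    if missing:
        at_risk.append((system["host"], sorted(missing)))

print(f"{len(at_risk)} of {len(inventory)} systems are missing required patches")
for host, missing in at_risk:
    print(f"  {host} missing: {', '.join(missing)}")
```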

 

Mapping ‘patch management’ against a Defense in Depth Information Security Strategy model shows that it allows for Prevention of exposure to known vulnerabilities where patches exist.  It can provide Detection capabilities to improve alerting on attempted as well as successful attacks.  Once systems are compromised, the Response function aids in restoring services back to a normal state.  The combination of indicators generated in these areas may assist in efficiency improvements and be used to comprehend future trends, thereby providing a potential Prediction opportunity.

 

Overall, actively managing end-node security via ‘patch management’ is very important.  I doubt any serious security professional is advocating turning off all patching or remote system security updates.  The value may vary over time and across different systems, but we have a lot of control over how this capability evolves and the value it returns.  We are empowered to maximize the return on investment.

 

The question still remains, from a measures and metrics perspective: how best can we show and quantify the benefits, efficiency, and value?  The industry as a whole has yet to tackle this challenge adequately or consistently.  That discussion is fodder for another blog.

Last year Intel announced the ultra-mobile PC using the Intel(R) Atom(TM) Processor N270 (1.60GHz). I commented on its particular incarnation as the Purse PC. I was holding my breath waiting for this to come out. The designer PC is here. I first saw it in a major luxury retailer’s catalog and then in a magazine ad. It is designed by Vivienne Tam and is adorned with large pink and orange chrysanthemums.

Men, don’t stop reading here. It’s your chance to buy your significant other a piece of electronic equipment as a gift and get away with it!

I am still waiting for one from one of my favorite designers, something that is iridescent black with contrasting corners, or pinstripes, or white-on-white damask…or…retro. We need variety to create market mass and user identity differentiation. But I think the floral treatment is a good start. It is beautiful.

And...for the security minded, who the heck is going to steal this thing from you in an airport?

“Pardon me, I believe that is my Vivienne Tam PC you are carrying underneath your trenchcoat. I am sure it was just a mistake that you confused it with your black standard-sized notebook.”

The ideal theft-proofing will be the DIY design-your-own PC skin at the vendor website, using a host of design elements, colors, spray-paint effects, etc., all baked in of course. That screams: “This is not my Dad!” via remote control. You saw it here.

Next question: what are people putting on their purse PCs? I have to know! And what is the demographic? Click for some clues. The purse PC does have a matching carrying case, and a couture dress, if you can afford it. But I noticed that on another site someone complained that there was no mirror on it. Possible new graphics app: Mirrorware. Hmmm.

Research into how bacteria communicate and cooperate may hold lessons about how computer malware will evolve.

 

Bacteria and malware evolution

I recently watched a fascinating presentation by Bonnie Bassler on how bacteria communicate.

My information security brain started thinking about the similarities between the evolution of computer malware and that of bacteria.  Over the course of billions of years, bacteria have devised remarkably efficient ways to communicate, survive, and even destroy large and complex systems.  This may be the most logical path for the successful evolution of computer malware, and a peek into the future of information security challenges.

 

Bonnie is a passionate and articulate speaker who outlined how these simple single-cell critters work as a team to coordinate activities in a perfectly synchronized manner.  Their actions are stealthy and methodical, and they can accomplish incredible objectives through teamwork on a scale humans have never achieved.  They infect, quietly multiply, and wait.  Bacteria independently determine the size of their community and decide to act based upon rudimentary communication and awareness.  When conditions are right and a level of potential virulence is attained, they team up in the billions and act in a choreographed manner, striking simultaneously to bring down their target.
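To make the quorum-sensing idea concrete, here is a toy simulation of my own (not something from Bassler's talk): each cell secretes a little signaling molecule into a shared environment, and only when the accumulated concentration implies the population is large enough does everyone switch behavior at once. The numbers are arbitrary assumptions.

```python
# Toy model of bacterial quorum sensing (illustrative only; all numbers are assumed).
# Each cell releases signal molecules into a shared environment; no cell acts
# until the accumulated signal implies the population has reached a quorum.

SIGNAL_PER_CELL = 1.0     # signal released by each cell per time step (assumed)
DECAY = 0.5               # fraction of signal that dissipates each step (assumed)
QUORUM_THRESHOLD = 500.0  # concentration at which every cell activates (assumed)

def simulate(initial_cells: float = 10, growth_rate: float = 1.3, steps: int = 25) -> None:
    cells, signal = initial_cells, 0.0
    for step in range(steps):
        signal = signal * (1 - DECAY) + cells * SIGNAL_PER_CELL
        if signal >= QUORUM_THRESHOLD:
            print(f"step {step}: quorum reached with ~{int(cells)} cells -> coordinated action")
            return
        cells *= growth_rate  # quiet multiplication while below quorum
    print("quorum never reached")

simulate()
```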

 

In many ways, computer malware acts similarly to bacteria.  Malware infects computers that are part of a larger community.  Both malware and bacteria want to remain stealthy until they are ready to strike.  Malware exists as basic lines of code with simple rules; bacteria are organisms that behave in simple ways.

We are seeing the malware industry evolve with more ambitious goals.  Infection of a single node in a network is no longer sufficient to achieve the desired objectives.  Malware must be developed to meet new challenges.  Bacteria are the masters of infiltration, stealth, and surprise coordinated attacks against behemoth adversaries.  In the future, malware may take some lessons from its biological doppelganger.

 

So how may malware evolve?

Malware design may shift to very small autonomous pieces.  Modern malware is generally a single package of standalone code that may exist as a file or attach itself to other code.  Deciphering this complete nugget will typically reveal all its secrets.  In the future such code may be broken up like the pieces of a puzzle.  Each piece means very little and appears harmless. Only when they come together does the malevolent picture come into view.

 

Code will replicate itself and seek deeper penetration into all manner of systems.  With little risk of exposing the big picture, these pieces can be distributed and replicated far more widely.  Computer environments are full of innocuous code such as temp files, random packets, application remnants, and unneeded data.  Most code and data is ignored unless deemed dangerous.  These pieces can quietly infiltrate many different operating systems, applications, data stores, and the communication traffic of clients, servers, storage, and network devices without raising alarm.

 

Malware will be very quiet, acting locally and not attempting to communicate outside of the environment.  Much of today’s malware is detected as it attempts to communicate with command-and-control systems outside the target network.  Evolved malware code will appear harmless, quiet, and unnoticeable until the right success conditions are met.  Local community awareness via ‘quorum sensing’ between the pieces within a target environment would likely not be detected.  Only when the right elements are in place will the pathogenicity be realized, as unified activation is initiated and virulence is rapidly achieved.  This will leave security little chance to mount a meaningful response.

 

Malware has a lot to learn from its slimy cousin.  Maybe someday malware writers will become as smart as these microbes.  On the upside, security can learn from the same teachers.  Just don’t blame our microscopic symbionts for the malice; after all, we exist in their world.  The battle continues.

 

One of the areas I’m focusing on for Intel IT is Sustainability.  My specific area of focus is energy use in office spaces, including, but not limited to, client systems.  As I got started, one of the first issues I discovered was a general lack of baseline data on how much energy is used in the office space today.  If you want to implement changes to save energy, you have to really understand where you are starting from.

 

So, before I can really try some proof-of-concept activities to reduce energy use in the office, I’ll first need to establish a solid baseline.  I hope to soon start a small proof of concept to physically meter energy use in the office space.  Once the baseline is established, several additional phases will take place to see how energy use can be reduced via awareness, software, and electrical controls in the office.
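As a rough sketch of what establishing that baseline might look like once meter data starts flowing, here is a minimal example that rolls raw readings up into a per-device daily figure. The sample readings, device names, and sampling interval are hypothetical.

```python
# Minimal sketch: turn raw meter samples into a per-device daily energy baseline.
# The readings, device names, and sampling interval are hypothetical.

from collections import defaultdict

SAMPLE_INTERVAL_HOURS = 0.25  # assume one meter reading every 15 minutes

# (device, watts) samples collected over one day
samples = [
    ("desktop", 85), ("desktop", 90), ("desktop", 5),
    ("lcd_display", 32), ("lcd_display", 1),
    ("laptop", 22), ("laptop", 18),
]

energy_wh = defaultdict(float)
for device, watts in samples:
    energy_wh[device] += watts * SAMPLE_INTERVAL_HOURS  # watt-hours for this sample

for device, wh in sorted(energy_wh.items()):
    print(f"{device}: {wh / 1000:.3f} kWh measured for the day")
```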

 

I’ll be back with more info as we get started but, in the meantime, I’d love to hear from you.  Have you already looked at energy saving in the office environment?  Is it something you are considering looking at in the near future?  What are your thoughts, concerns, focus areas?

 

-Mike



Over the last two years our software inventory solution has evolved as our policies and procedures were adopted throughout the company. We have started shifting our primary focus away from inventory reduction (end of life), and I (personally) have begun to target other software solutions and how they play into the management of our software as a portfolio.

 

That means tracking not only what a piece of software is, where it is, and who manages it, but now also its cost, projects, and people. One key contributor to how effective we will be is how we map our software solutions to an internal capability framework. We are starting to see where we have overlapping features as well as gaps in delivering key capabilities.
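As a simple illustration of what that mapping can surface, here is a minimal sketch that maps applications to the capabilities they deliver and then reports overlaps and gaps. The application names and capability framework are hypothetical, not Intel's actual portfolio.

```python
# Toy sketch: map applications to business capabilities, then flag overlaps
# (several apps delivering the same capability) and gaps (capabilities no app
# delivers). All application and capability names are hypothetical.

capability_framework = {"demand planning", "order management",
                        "invoicing", "warehouse management"}

app_capabilities = {
    "AppA": {"demand planning", "order management"},
    "AppB": {"order management", "invoicing"},
    "AppC": {"invoicing"},
}

coverage = {cap: [] for cap in capability_framework}
for app, caps in app_capabilities.items():
    for cap in caps:
        coverage[cap].append(app)

overlaps = {cap: apps for cap, apps in coverage.items() if len(apps) > 1}
gaps = sorted(cap for cap, apps in coverage.items() if not apps)

print("Overlapping capabilities:", overlaps)
print("Capability gaps:", gaps)
```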

 

Internally, we have a few key solutions that contain this information, and although it has been a challenge to connect the data, we are making strides toward the total solution.

 

Our shift towards Application Portfolio Management (APM) has the goal of painting a better, more complete picture.

It is a challenge.

 

If you are on the same journey, please drop me a note. Company always makes the drive more enjoyable, and shared learnings keep us all from getting lost along the way.

In a recent TechRepublic article, Jason Hiner asks: Are Netbooks quietly driving us to Thin Clients and Cloud Computing?

Of course, the article is primarily about netbooks and how wonderful they are. No argument here. But the question of thin clients versus cloud has popped up in an interesting way. Thin clients matter because of the nature of the netbook, but what does that have to do with cloud computing? Not all cloud applications are thin.

Perhaps the logic is as follows:

  • If cloud then we are delivering services over the internet
  • If internet then we must be using a browser
  • If browser then the computing must be taking place in the backend with only the UI distributed to the client device
  • Therefore all cloud devices must be thin

So what about rich clients? We happen to think that they are perfectly suited to cloud computing. Maybe our latest whitepaper on Better Together: Rich Clients and Cloud Computing can help set the record straight – or at least prompt some alternate thinking.
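As a tiny illustration of the point that a cloud-backed application does not have to be thin, here is a sketch of a client that pulls data from a (hypothetical) cloud endpoint but does the compute-heavy part locally on the rich client. The URL, data shape, and fallback data are assumptions for illustration.

```python
# Illustrative only: a "rich client" that consumes a cloud service but keeps the
# heavy computation on the client. The endpoint and data shape are hypothetical.

import json
import statistics
from urllib.request import urlopen

CLOUD_ENDPOINT = "https://example.com/api/readings"  # hypothetical cloud service

def fetch_readings() -> list:
    """Thin part: pull raw data from the cloud (falls back to sample data offline)."""
    try:
        with urlopen(CLOUD_ENDPOINT, timeout=5) as response:
            return json.load(response)
    except Exception:
        return [21.3, 22.1, 20.8, 23.4, 22.7]  # sample data so the sketch still runs

def analyze_locally(readings: list) -> dict:
    """Rich part: the CPU-bound analysis runs on the client, not in the cloud."""
    return {
        "count": len(readings),
        "mean": statistics.fmean(readings),
        "stdev": statistics.pstdev(readings),
    }

print(analyze_locally(fetch_readings()))
```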

I was wandering around the house last night listening to the persistent hum of technology all around me. I realized that I might possibly have more technology in my house than pieces of furniture, so I started to add it all up to see what in the world had happened to my once peaceful and quiet life.

 

Starting from the fibre coming in I've got a Netgear* switch. Attached to that is a linux server that faces the world, and a Netgear Wireless-N router that manages the internal house network. The Wireless-N supports a Vista* 64 desktop, Vista 32 notebook, XP* Notebook, my iPhone*, and a Linksys* Wireless-G router in another room.

 

Off the Wireless-G router we've got a Vista 32 notebook, XP desktop, two Apple* Airport Express* devices (for music streaming to living room and back yard), and a Dell* Wireless-G access point.

 

Off the Wireless-G access point we've got the Xbox* 360, Nintendo Wii*, DVD player, and DVR connected (for on-demand downloads).

 

It's actually more complex than it needs to be, but only because I haven't finished upgrading the house to Wireless-N, and because I still haven't gotten around to building a media center PC in the living room, which could replace the DVR, the DVD player, and one of the Airport devices. (By the way, I'm going to be building my own Core i7* based system in the next couple of months, so stay tuned for a play-by-play.)

 

I'm actually amazed at all the tech that is required to do some very simple things that I want to do - like streaming music from my computer across the house into the stereo in the living room. Right now we're using an Airport Express to do that (along with iTunes* and AirFoil*), but why can't my stereo have a network plug so I can send to it directly?

 

Which begs the question: why isn't EVERYTHING in the house network-aware at this point? I mean, it's 2009, right? My TV should be wireless or Bluetooth at a minimum, and so should the stereo. And while I'm at it, why not the refrigerator, so that I can constantly pull an inventory based on RFID tagging of products...and the thermostat, so I can remotely manage it...and the alarm system...and the outdoor sprinkler system...the list goes on.

 

But can it be too much? Even if I close the office doors at night I can still hear the hum of hard drives, CPU fans, and router fans - it echoes all over the house (not to mention heating it up quite a bit). I really don't want my house to become a data center, I just want to be able to do what I want to do! I want music no matter where I am, I want to watch movies on any system in the house, I want to be able to access every computer from every other computer...my demands aren't that big, are they? :-)

 

So I want to hear about YOUR home network. What gadgets are you using to techify your life, and what recommendations do you have for others who are designing their home? I'm ready to learn how I can simplify!

 

* Product and vendor names are copyrights/trademarks of their respective companies.
