
I've been working with application streaming for the past year or so, and often when I go to other groups within Intel to try to convince them that streaming is the way to go, they ask me: why bother? They reason that they are moving more and more to web-based applications, so why would they implement streaming for whatever applications they have left that are written in a client-side language?


Well, that's a good question. What's interesting is that this is the response I get from the management team, but when I talk to developers, they are (usually) very excited about this possibility.


Why? Because most developers would rather develop in a language that is more robust than what you typically get with web application tools. They like having more control over the user's graphical experience and the ability to do more powerful things within the language. The management team is looking at where current momentum is going, and they don't want to change direction. But should they?


Is there a compelling reason to move toward more client-based applications and away from web applications?


Should we have both in our toolkit? If so, where should we use which one? Why would we ever build more client-based applications (not sure what else to call them: applications where the execution of logic happens on the client)?


To evaluate this question, I think it would be good to go back to why we started moving to web-based applications in the first place, for those who have been around long enough to remember when all we had was client-based execution of applications (after mainframes, of course). Unfortunately, I have been around for all of the above and have been working with applications the entire time, since 1976.


So I find this interesting - anyone else?


In my next entry - I will start to explore this question to see if we can build a case for moving to at least a more balanced view of where we put logic execution.


It has been an extremely long time since my last post... I am just coming off of successful left hip surgery on January 2.


Sam Lawrence's blog post highlighted the definitions below:




Social networking

has been focused on people connecting to people they already know. Not that there's anything wrong with that -- I love to find the people I know online, but for me it's usually a quick, "I found you, cool, now we're connected." I'm sure there are stories out there -- particularly with job searches -- where social networking has been instrumental. Outside that, it seems that more often it's card collecting. I can see that Jill knows Steve, but I'm no closer to knowing Steve; I just know that Jill knows him. I do like knowing there's a bag of potential contacts, even if I never use them.


Social Bookmarks

have a great purpose, too. I can see what other people mark as interesting content. I have no connection to them personally, but social bookmarking allows me to snoop "good readers" and track their information consumption. I follow the tags and feeds of a number of people but I have never said one word to them. I enjoy reading over their shoulder, though. It saves me a lot of time.


Social Productivity

is different from social networking or social bookmarking: it's about getting work done outside the team of like-minded people you work with every day. With social productivity, an idea is introduced and all sorts of people get to chime in on it. These could be people you work with a lot, people you've never worked with, or even people outside your company. Now, all of a sudden, your idea has been developed openly by all sorts of people who bring their own valuable perspectives. You can evolve those ideas into all sorts of collaborative or locked content, but thanks to social media, your original idea may now be much stronger.


There are many questions to be explored and answered. How do we get this to work in your company, inside the firewall? There are many services that work for our outside interests: Facebook, MySpace, and LinkedIn cover some of the social networking. Do we need this inside the firewall? Will people take advantage of it? Or will this just be an increase in email, IM, and phone traffic?


How do we move forward to the real deal of getting work done? I spent some time with folks who provide tools in almost all of these spaces. In theory, and watching them demonstrate how they work and can work, they all look good. Basically, the real meat on this bone is: how will we change our behaviors to make this successful?



One of the emerging compute models is enterprise streaming.  With streaming, applications and/or operating systems are downloaded to clients for temporary local execution.  End users access software on demand.


Intel IT has been evaluating streaming as a way to boost productivity and lower costs.  Streaming can streamline IT operations by consolidating backend infrastructure, while preserving user experience. 


Listen to the podcast for a brief introduction to the work taking place in Intel IT.



For decades there was a simple model in place when it came to consuming software within large companies. It had only two branches: one involved creating something new, while the other involved installation and configuration. Simply stated: "Make versus Buy."


Make decisions involved the custom development of an application to fill the requirements of the consumers. This meant that software development tools and development resources, as well as migration, testing, and hosting capabilities, all had to be maintained internally.


Buy decisions analyze functionality versus consumer requirements, as well as the costs of purchasing, licensing, and supporting the product and the installation, migration, testing, and hosting capabilities necessary. The company providing the product is also considered. It is usually a company that specializes in a product or product grouping and can deliver it at a lower cost than it would take to build internally. Often it can also provide upgrades and support at a lower cost, assuming the product meets all your needs out of the box.


As the application portfolios of companies became larger, analysis began to include another branch. Instead of building something new or buying a product to install, you would expand upon the capabilities of an existing tool through merging and/or simple enhancement. This means our simple model is now: "Make versus Buy versus Enhance."


Enhance (or merge) decisions brought together the consumers of the current application and those wishing to have additional functionality.  The amount of regression testing would increase, and the overall architecture of the application had to be considered to prevent the creation of a Frankenstein application that does not adhere to your internal guidelines.


Much of what I read today points to a trend among large companies toward consuming software produced and hosted by someone else. You would think this is the "Buy" branch discussed above; however, the method for both consumption and installation is different. This expands our decision tree to include "Make versus Buy versus Enhance versus Rent."


Rent is a paradigm shift from the conventional close-to-the-chest business practices most companies have used to keep competition at bay.  Now imagine a time when all you do is start your computer and load a web browser.  Inside the browser you have access to all the document creation and management, business tools, messaging, and any other functionality you need to perform your job.  The difference here is that none of these applications live inside your company, and you pay only as you use them.


So where does this leave us as software developers?  Are our days numbered?


I think not -- yet.  The movement to a rent-based consumption model takes time.  Time for a company to get over its fears of releasing some control to someone else.  What most people do not realize is that we already do this daily.  Think about the electricity that runs your factories and offices, and ask yourself where it comes from.  Do you generate it yourself, or do you consume it as a utility, in a renting fashion?


For a while software developers will be performing the following:

  1. Building what does not exist

  2. Enhancing

  3. Merging

  4. Configuring

Eventually we will be doing less and less coding and more and more configuring.  As the industry providing our software (and the infrastructure) matures and its reliability increases, you will see a switch.
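The four branches discussed above can be sketched as a simple ordered rule check. This is purely an illustrative sketch: the function name, inputs, and rule ordering are my own assumptions, not a description of any real company's decision process.

```python
# Hypothetical sketch of the "Make versus Buy versus Enhance versus Rent"
# decision tree. All names and the rule ordering are illustrative.

def sourcing_decision(exists_internally: bool,
                      exists_as_product: bool,
                      hosted_offering_acceptable: bool) -> str:
    """Pick a sourcing branch for a requested capability."""
    if exists_internally:
        return "Enhance"   # extend or merge into an existing application
    if hosted_offering_acceptable:
        return "Rent"      # consume it as a hosted, pay-as-you-use service
    if exists_as_product:
        return "Buy"       # purchase, install, and configure a product
    return "Make"          # nothing fits: custom development

print(sourcing_decision(False, False, False))  # -> Make
print(sourcing_decision(False, True, True))    # -> Rent
```

In this sketch, "Rent" is checked before "Buy" to reflect the shift the post describes; a real evaluation would of course weigh costs, requirements fit, and vendor viability rather than simple booleans.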


It will take time.  Time to settle concerns, time to change opinions and time to move over data and consumers.


I imagine that this switch will allow those companies to focus more on their key products and less on the outlying functionality necessary to run the business.


What are your thoughts?

Several Intel IT folks (and others!) have expressed concern over the back-end implications of hosting a streamed computing solution. How many clients can be supported by a server? How will streaming affect the network? Well, we had the same questions so we constructed a lab experiment to find out.


Streaming was more efficient than we expected. We demonstrated that server utilization remained low and network utilization improved over time. We successfully executed a variety of applications including audio and video. We also encountered a few challenges.


Want to know more? Read our full report: Streaming and Virtual Hosted Desktop Study

The promise of WiMAX is just around the corner.  Check out this site, which puts you in the passenger seat of automobiles around the country: [PocketCaster WiMAX Demo|]


For some perspective: if I walk more than 10 feet with my laptop at a brisk pace, my connection drops... this is simply amazing.  It makes you wonder what we'll be able to do with phones and handheld devices in the coming year (see Moorestown: Much more than an iPhone killer).   Can't wait!

Another twist to energy conservation is energy reuse. It is possible to use the waste heat generated in a data center for non-data-center heating needs. This can be thought of as energy recycling, a method of reducing the impact of data centers on the environment. A great opportunity to enable a data center for energy reuse is at its conception, as a goal during the design. With the growing number of data centers being built, now is the time to integrate energy saving and reuse methods. Data centers located in climates with a need for heating will hopefully recover and reuse a majority of their operating energy for heating non-data-center areas and adjacent buildings in the future. Since data centers operate at a relatively constant energy demand, they offer a stable and near-constant source of heat. The colder the environment, the more financially attractive it can be to install the equipment needed to recover and transfer heat from the data center to adjoining areas or even neighboring buildings. Imagine data centers located in downtown areas sending off their heat energy and heating other buildings as if they were a utility provider!
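The "stable and near-constant source of heat" point lends itself to a quick back-of-the-envelope calculation. The figures below are illustrative assumptions, not measurements from any Intel facility.

```python
# Back-of-the-envelope sketch of annual recoverable heat from a data
# center with a constant load. All figures are assumed, not measured.

IT_LOAD_KW = 500          # assumed average IT electrical load
HOURS_PER_YEAR = 8760     # data centers run around the clock
RECOVERY_FRACTION = 0.6   # assumed share of waste heat actually captured

# Nearly all electricity consumed by IT equipment ends up as heat,
# so annual recoverable heat energy is roughly:
recoverable_kwh = IT_LOAD_KW * HOURS_PER_YEAR * RECOVERY_FRACTION
print(f"{recoverable_kwh:,.0f} kWh/year")  # 2,628,000 kWh/year
```

Even under these modest assumptions, a mid-size room yields millions of kilowatt-hours of heat per year, delivered at a nearly flat rate, which is exactly the profile a district-heating style consumer wants.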


Read Data Center Heat Recovery Helps Intel Create Green Facility to see more about what we are doing to reclaim and recycle energy.

Each year Intel IT publishes an Annual Information Technology report documenting our key initiatives and how we performed against them. The Intel Information Technology 2007 Performance Report is now available. It is worth noting that 2007 was another year of substantial change for Intel IT. Some of the key highlights:


  • We entered our second year of a multi-year replatformization of our ERP environment.

  • We streamlined our decision making and governance by eliminating 67% of our forums necessary to make a decision.

  • We announced our long-range data center efficiency initiative, which is expected to achieve a $1B (US) cost avoidance.

  • We focused on standardizing and reducing the number of applications.  We removed more than 450 outdated & redundant applications!


We welcome your comments on how we did.  I would especially like to hear how your companies evaluate their IT performance.

Crazy as it may sound, digital appliances and accessories can infect your computers with viruses and worms. It is happening more and more. Although not near a tipping point, an evil cloud is rising.






Unlikely Threats

It is concerning enough that we have to worry about USB drives, WiFi hotspots, mobile phones, PDAs, printers, email attachments, file downloads, search engines, and surfing just about any website. But now we must also keep a suspicious eye on our new net-enabled refrigerator, digital picture frames, music-playing sunglasses, and even the toaster.



Recent articles show how consumer devices integrated with network-enabled computers are sources of malware infections. It is not shocking that software CDs/DVDs or USB drives might have nasty code lurking. Suspicion is the norm anytime we are connecting or installing something directly to our trusty computer. In those situations, we take proper precautions. But what about media players, GPS devices, and, most recently, wireless digital picture frames? These devices may not connect directly via a traditional cable. Does the average consumer realize that when they flip the power button, they may be turning on a wireless device infected with malware, seeking to infect anything within range?





The toaster is out to get you!

It is not just the geek toys anymore. Not too long ago, an enterprising individual took it upon himself to hack a regular toaster, just to prove it could be a source of malware. A toaster! Very impressive, but what is next?



As computers are integrated into everything and are being upgraded with more power and connectivity, the threat landscape grows. Our cars, major appliances, personal electronics, accessories, and even clothing are potentially at risk.  We are dragging these items into the digital world and in doing so, overlaying cyber risks on them.



Although not widespread, more and more stories are emerging and the list of products grows longer. At some point we will be forced to re-evaluate the standard threat categories to include some non-traditional vectors. Personally, I am waiting for shoe manufacturers to implant computers in their products so we can have "walk-by attacks". Can't wait.




Some news reference links:,141295-pg,1/article.html

It seems like the pendulum is always swinging.  First all computing was centralized on a mainframe with users connected via terminals.  Then, over time, computing became more and more distributed with individual PCs and peripherals.  Is it now time to consider a more centralized computing model?


Networking continues to become more robust.  Users have an expectation of connectedness, whether working within the enterprise, at home or on the road on a wired or wireless network.  If you are always connected then you can reconsider how to make the best use of your computing power, locally or on the network, as long as it is fast enough.


Thick, thin or something in between: what are your thoughts?


We went to the street during the Consumer Electronics Show in Las Vegas and asked people to respond to something we showed them.  Check out the interesting responses.

If you've attended any of the Intel Premier IT Professional events, or if you have been following our Data Center blogs, you're no doubt aware that Intel is in the process of transforming our computing and data processing back end.  We're moving from a sprawl of resources spread across more than a hundred data centers to a much smaller footprint.  We've been deploying grid computing and virtual servers to slow the rate of growth of our computing capacity.  We're also changing our operations processes, applying disciplines that were originally developed for our factories to improve the way we manage our data centers.  A little over a year into the project, some of our team (Uttam Shetty, Alan Ross, Brently Davis, and I) have put out a white paper to summarize our goals, focus areas, and preliminary results.  We've uploaded the paper Transforming a Global Data Center Environment as a resource, which you can read/download.
