
IT Peer Network


A while ago I found a blog post titled “What is Hindering Cloud Computing Uptake?” written by Walter Bailey of CloudTweaks.com. I was interested to see what he had come up with to explain why uptake remains low. Bailey states that cloud computing uptake remains low because people are not enthusiastic about moving their business into the cloud. Additionally, he adds five other possible reasons:

 

  1. IT functionalities (control, security and privacy)
  2. Infrastructure problems (systems are old and do not function well in a cloud environment)
  3. Talent shortage
  4. Budget and cost implications
  5. Pooling of resources

 

His arguments reminded me of an experience I had a couple of years ago. I was visiting the CIO of a large organization and told him about the importance of cloud computing in allowing him to respond quickly to the needs of the business. He laughed, telling me it typically took the business two years to make up their mind, so there was no reason for him to respond fast. Remaining polite, I did not argue. I just asked myself when he was actually made aware of those needs, and whether the business people saw things the same way. I also asked myself how much “shadow-IT” was being used. I bet there was a lot. I called his approach resistance to change, and it will most probably leave traditional IT out of the limelight. Obviously, I never heard back from that account.

 

The reasons identified above will play a role, but with the exception of the first, I don’t believe they are the main reasons. There is a key reason that is missing. Let me explain.

 

Earlier this week, I was talking to an account team. They were describing their customers’ IT objectives. On the operations front, one objective immediately caught my attention. They wanted to reduce the provisioning of a server from 30 to 60 days down to one day.

 

Organizational issues

So, I asked the question, “Why does it take between 30 and 60 days?” They diligently explained their process to me. First, the server needs to be installed in the datacenter. That is initiated by the server team, but performed by the facilities guys. So, a couple of days are lost getting the approval and doing the work. Then the cables have to be connected so the server can be made operational. That’s another couple of days. Then, before the server is recognized in the environment, it needs to be entered in the asset management system. That roughly takes a week. You then need an IP address attached to the server; this requires the intervention of the networking team. And those guys have a heck of a lot of work these days, so count two weeks before it’s done, and then… By now you get the picture.

 

If we could calculate the actual amount of time spent working on the server, it would be well under one day, but by the nature of the organization and its processes, that one day turns into 30 to 60 days. 99% of the time the server is waiting. You know, that reminds me of my manufacturing days. Toyota and others developed Kanban to address exactly this. So, could they do the same? I’m not sure, as the issue here is the siloed structure of their organization.
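
To put a rough number on that claim (assuming a full day of hands-on work inside a 45-day elapsed window, the midpoint of the range above; the real touch time is less, so the real figure is even higher):

```python
# Rough check of the "server spends almost all its time waiting" claim.
# Assumes one full day of hands-on work within a 45-day elapsed window
# (the midpoint of the 30-to-60-day range quoted above).
touch_time_days = 1
elapsed_days = 45
waiting_fraction = 1 - touch_time_days / elapsed_days
print(f"waiting: {waiting_fraction:.0%}")   # -> waiting: 98%
```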

 

Traditional IT organizations are focused on specific technologies. You have the server guys, the storage people, the network team, the database administrators, etc. Automation can help, but is often resisted because there is a feeling many operators will lose their jobs if processes are automated.

 

About a year ago, I ran a cloud discussion with a customer IT department, and it completely failed. It’s actually the only time this has happened to me, so it’s not readily forgettable. I should have known right from the moment the customer IT department presented themselves that I was set up for failure. There was the person responsible for small Windows servers, the one for large Windows servers, the one for Linux servers, the one for NAS storage, the one for SAN storage, the person responsible for firewalls, the head of security, the data center network guy, the WAN guy, etc.

 

My presentation covered all aspects of cloud including security, compliance and risk management. As I always do, at the end of the presentation, I asked them if I had addressed their expectations. And a nearly unanimous answer came back: “You have only spent 5 minutes on my subject.”

 

Transform the IT organization

Cloud computing cuts across all these silos and requires collaboration between the different subject matter experts. Organizations have “antibodies” that reject ideas and people that change the way things are done. It’s often called “resistance to change,” and it is deeply ingrained in many teams. It’s often related to the fear of losing control—of having to transform.

 

Cloud Computing, as soon as you get beyond IaaS, is a game changer, so no wonder people are nervous. It forces IT professionals to transform their job. And that is often painful. Unfortunately, these days, change is the only constant and that applies for all of us, including IT professionals.

 

What makes it more complex, though, is the fact that IT needs to transition between two models, as IT departments will run their traditional and cloud environments in parallel for the foreseeable future. The teams focused on the legacy applications can continue operating the way they have been, while the teams delivering the new applications and services need to adapt to the new environment. So, IT organizations will have to combine two approaches, and that is complex and difficult.

 

If we agree that the hybrid cloud is the way to go—that IT becomes the strategic broker of services to the business—four key functional teams are required:

 

  1. A Cloud Platform team, operating and managing the environment required to deliver internally developed services, as well as externally sourced ones, to the business users in a transparent way. This platform includes an integrated portal, a service catalog, and the environment required to deliver and intermediate/aggregate services, as I described in a previous blog entry.
  2. A service development team, developing the “core” services the company has decided to keep within the company.
  3. A sourcing team, focused on the sourcing and intermediation/aggregation of all “context” services the enterprise obtains from external providers.
  4. A services governance team, reviewing with the business the services that need to be provided, managing the service lifecycle with them, and identifying the core and context services.

 

Build a unique user experience

Users want specific services, and if IT does not deliver them, the users will go around IT and find what they need. That’s what is called “shadow-IT.” The issue is that users do not always realize the implications of the choices they make. It’s up to IT to deliver the services they require, by either developing or sourcing the appropriate services. The user is most often not interested in where the service comes from; he or she just wants to use it. The easier this is done, the better. So, IT should build a single portal through which the user can access all the services required to do his or her job. The complexity of accessing the service should be shielded from the end user. This is precisely what we try to achieve with converged cloud: developing a unique user experience. Aggregation and intermediation of services from external sources should be transparent. The benefit for IT is that the sources can change without having to migrate users.

 

Conclusion

Yes, control, security, and compliance (e.g., privacy) are inhibitors to the use of public cloud. But if we take cloud in its widest sense (both private and public), the biggest inhibitor of all, for enterprises, is the way the IT department is organized. Many CIOs realize a change is needed, but they have to battle against very strong resistance to change. The fact that traditional and cloud environments will have to live together for the foreseeable future makes their job more difficult. The most advanced understand this and act accordingly. They will have a bright future, as they will be increasingly integrated with the business. Many are in a wait-and-see mode, and some are simply in denial. Where are you on this journey? How are you addressing the integrated world of traditional and cloud environments?

 

Christian Verstraete is the Chief Technologist, Cloud, at HP and has over 30 years of experience in the industry, working with customers all over the world, linking business and technology.

 

Follow Christian's blog

IT departments have likely seen an increase in employee requests for tablets or mobile devices as the consumerization of IT continues to shape and change the corporate landscape. Whether staff need network access under a BYOD scenario or are being issued company-owned devices, cost and performance are two important factors when determining what devices to purchase.

 

A low-cost tablet can be an effective way to provide the mobility features employees want without having to make a substantial investment in new equipment. In some cases, staff can be outfitted with a 2 in 1 device that will serve their needs for productivity as a traditional laptop while giving them the touchscreen display of a tablet (see our TCO report Save with a 2 in 1 Ultrabook). To help with this decision process, Intel commissioned Principled Technologies to review three Android tablets to determine which would be the most cost effective for staff.

 

-IT Peer Network Administrator

 

Principled Technologies reviewed the Dell Venue 7 – powered by an Intel Atom processor, along with the Samsung Galaxy Tab 3 7.0 and the Google Nexus 7 – powered by Qualcomm Snapdragon processors – to see which had the best combination of web browsing experience and cost.

 

To start, the Dell Venue 7 was the least expensive device at $149, followed by the Samsung Galaxy Tab 3 7.0 at $169 and the Google Nexus 7 at $229. Next, Principled Technologies tested the browsing experience on popular websites like nytimes.com, abc.com, huffingtonpost.com, nba.com, and others, and found that the Dell Venue 7 and Google Nexus 7 had the same relative experience. The Samsung Galaxy Tab, however, had several display issues, including choppy scrolling, blurry text, and slow load times.

 

Because manually testing websites can be subjective, Principled Technologies used their WebXPRT and MobileXPRT benchmark tools to run four HTML5- and JavaScript-based workloads: Photo Effect, Face Detect, Stocks Dashboard, and Offline Notes. As you can see from the results below, the Google Nexus 7 had slightly better performance than the Dell Venue 7, but both out-performed the Samsung Galaxy Tab.

 

[Table: WebXPRT and MobileXPRT results for the three tablets]

Based on these tests, Principled Technologies concluded the Dell Venue 7 tablet was the best device when considering price. Though the Google Nexus 7 had equal storage and a better web browsing experience than the Dell Venue 7, at almost $80 less the Dell Venue 7 was considered to be the better value.

 

Of course, the Dell Venue 7 isn’t the only Intel-based tablet on the market. Intel is committed to improving security on Android tablets and working with hardware partners on optimized 64-bit processors for the platform, so more robust devices are likely to be rolled out to market soon. Carefully choosing the mobile device that is best for your organization requires planning, and oftentimes a 2 in 1 device might be more appropriate for staff who travel on a regular basis, but hopefully this quick guide will help IT decision makers save time and money when selecting a tablet.

 

For a more in-depth review of the three tablets, check out Principled Technologies’ report, “A better Android tablet experience for your money.” Are you considering purchasing a tablet for your enterprise? If so, what features are necessary for your staff? Leave a comment below or join the social conversation on Twitter using the hashtags #ITCenter and #tablets.

20 Questions on SSD: Things you’ll want to ask

 

Question #5: How can I use Temp, Tier, and Cache + Intel SSD?


 

So it’s been a few weeks since my last blog, and I want to switch gears away from talking about the features of the Intel Data Center Family of SSDs. We’ve gone over features like production qualification, endurance requirements, power-loss protection, and finally consistent performance. What I want to talk about in this blog is solutions. Specifically, today we have a number of options to leverage SSDs without going to the extreme of replacing your hard drives (HDDs) 1-for-1 with SSDs for any particular application.


The solutions team I work in at Intel has spent the last year exploring the benefits of our SSDs in a wide array of environments from Big Data to Virtualization with many stops in-between. The interesting commonality in all of these explorations is that in many cases, the best ROI and TCO benefits come from using the SSD as temp, tier, or cache! There are a few use cases where 1-for-1 replacement of traditional HDDs pays off, but for the most part… repeat after me; temp, tier, and cache... think lions, and tigers, and bears, and a zoo!

 

In the ‘temp’ space (wow, that pun was even a stretch for me), we’ve seen goodness in Hadoop with jobs that produce intermediate data by changing “mapred.local.dir” over to an SSD, and with relational databases by moving ‘TempDB’ to a local SSD. In tiering with virtualization, we demonstrated a number of solutions where we built low-cost, 100% SSD, software-based SANs and moved VMs into these high-performance NFS or iSCSI datastores. Using SSDs as buffer/journal/cache, we’ve looked at software-based scale-out storage solutions such as VMware’s VSAN, PernixData, Microsoft Storage Spaces, and open source options such as Ceph. In the pure cache space, we’ve also looked at several different caching software packages, including Intel CAS (Cache Acceleration Software), and how these packages can benefit enterprise IT workloads.
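
Before relocating anything, a trivial first check is whether the candidate SSD location actually outruns the current temp location for the kind of large sequential writes that spill and temp data generate. A minimal sketch (the mount points are placeholders, not from our testing):

```python
# Minimal probe: compare sequential write throughput of two candidate temp
# locations (e.g., the current HDD-backed temp dir vs. an SSD mount point).
# The paths below are placeholders -- substitute your own.
import os
import time

def write_throughput_mb_s(path, size_mb=256, block_kb=64):
    """Write size_mb of zeros under `path` in block_kb chunks and return MB/s."""
    block = b"\0" * (block_kb * 1024)
    blocks = (size_mb * 1024) // block_kb
    scratch = os.path.join(path, "ssd_temp_probe.bin")
    start = time.time()
    with open(scratch, "wb") as f:
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())   # push the data to the device, not just the page cache
    elapsed = time.time() - start
    os.remove(scratch)
    return size_mb / elapsed

if __name__ == "__main__":
    for candidate in ("/data/hdd_temp", "/mnt/ssd_temp"):   # hypothetical mount points
        print(candidate, round(write_throughput_mb_s(candidate), 1), "MB/s")
```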

 

So, there’s plenty of opportunity to leverage SSDs to accelerate your enterprise workloads. The question is, “Will your workload benefit enough to yield an ROI or a TCO improvement?” Which brings me to my second point: your workload really matters! In my IT past, I often handed off storage workloads to the SAN (Storage Area Network) team. When application performance issues arose, we looked at storage, determined whether that was the cause, and requested more IOPS or throughput from the SAN. In contrast, most of the temp, tier, or cache implementations studied by this team focus on locally attached storage, or DAS (direct attached storage). This being the case, the application engineer and the systems engineer must work hand-in-hand to look at the workload, the details of the IO, and, in the case of our testing… determine how to best employ SSDs as temp, tier, cache, or, as a final option, complete conversion to SSDs (muahahaha)!

 

Let me illustrate with a brief walkthrough. Almost two years ago now, we published this paper on accelerating database workloads with Intel SSDs. In that paper we did a 1-for-1 replacement of local 15k disk drives with SSDs. If we were to repeat this exercise today, we’d need to investigate how much ‘TempDB’ was used, assess whether or not a solution like Intel CAS could help, evaluate features like ‘Buffer Pool Extension’ in Microsoft SQL ’14, and then finally look at wholesale replacement of the 15k data drives with SSDs based on both the IOPS and capacity requirements of the application.

 

The point here is there are now many more opportunities to realize the benefits inherent in SSDs while keeping capital spend to a minimum. A brief query on one of my favorite retail sites this month (April ’14) tells me that a 240GB Intel SSD DC S3500 runs roughly the same price as one of the major OEMs’ certified 10k SAS drives at 300GB, about $250.00. So in $/GB there’s still a 1.25x difference in cost.
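
For anyone who wants to check the math behind that 1.25x figure (using the rounded prices quoted above):

```python
# $/GB comparison behind the "1.25x" figure (April '14 street prices as quoted above).
ssd_price, ssd_gb = 250.00, 240   # Intel SSD DC S3500, 240GB
sas_price, sas_gb = 250.00, 300   # OEM-certified 10k SAS drive, 300GB
ssd_per_gb = ssd_price / ssd_gb   # ~$1.04/GB
sas_per_gb = sas_price / sas_gb   # ~$0.83/GB
print(round(ssd_per_gb / sas_per_gb, 2))   # -> 1.25
```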

 

I’m looking forward to the day when SSDs are, by default, the first choice in local storage, and we’re getting close. In fact, for uses like boot/swap the price is close enough to warrant having a RAID 1 SSD solution in-box and an extra ace up the engineer’s sleeve. Until then, there are a lot of things I can do with temp, tier, and cache. The question for the reader is: can you use one of these methods to help your applications and data center run faster while providing a good ROI or a decrease in TCO? Could you do more with less and increase efficiency by leveraging SSDs?

- Chris

 

Christian Black is a Datacenter Solutions Architect covering the HPC and Big Data space within Intel’s Non-Volatile Memory Solutions Group. He comes from a 23-year career in enterprise IT.

Follow Chris on Twitter at @RekhunSSDs.

 

#ITCenter #SSD

IT Center

2 in 1 Devices Cut IT Costs

Posted by IT Center Apr 11, 2014

As the workforce becomes more mobile, many companies are considering tablet devices for departments like Marketing and Sales in order to make completing tasks easier while on the go. Though tablets can be beneficial in several circumstances, IT should first look to new 2 in 1 machines, instead of purchasing and managing both a laptop and a tablet for each employee.

 

-IT Peer Network Administrator

 

In order to determine potential savings, Prowess Consulting analyzed three-year costs for an organization that plans to purchase two devices for each staff member: a sub-$1,000 laptop and either an Apple iPad Air or an Android (Samsung Galaxy Note 10.1) consumer tablet. In this scenario, both the laptop and the tablet would be managed, secured, and supported by IT, even in a BYOD scenario. Then, they compared those three-year costs against purchasing and managing an Intel vPro 2 in 1 HP EliteBook Revolve 810 G2 for each employee.

 

Though initial hardware costs might be assumed to be higher for IT departments purchasing two devices instead of one, the HP EliteBook Revolve 810 G2 device cost about the same as the combined hardware cost of the laptop and either consumer tablet. Similarly, the price of software across the different scenarios was considered to be roughly equal.


The cost advantages of the HP EliteBook Revolve 810 G2 became obvious when they added in the costs of hardware support, replacements, and, most important, the costs to deploy, manage, secure, and support. (The savings would become even more dramatic if you factored in the costs associated with additional peripheral devices and accessories, and the time employees must spend managing files and content; however, these were omitted from the analysis.)

 

With both employer-purchased and BYOD devices, the organization and workers benefit with the Intel vPro 2 in 1 solution because such devices have the built-in remote manageability and security features of the 4th generation Intel Core vPro processor and enterprise-grade manageability features available with Windows 8.1 Pro.

 

For example, faster hardware-based data encryption can secure information without slowing down work, and two-factor authentication with Intel Identity Protection Technology (Intel IPT) can help prevent unauthorized access to devices and business data. Plus, IT can configure, diagnose, isolate, and repair an unresponsive infected PC using remote support and monitoring capabilities embedded into select Intel vPro processors.

The analysis shows that the associated costs of support, management, and security nearly doubled in the scenario of IT having to deploy two devices per employee instead of one. Simply put, the management of devices doesn’t scale.

 

Aside from lower IT management costs, users can enjoy a better work experience with the Intel vPro 2 in 1 because it can run full-featured Microsoft Office and other applications, which workers rely on, in tablet mode. iPad and Android tablets do not support these applications natively and offer less capable alternatives. Additionally, iPad and Android tablets do not match the browsing functionality or multi-tasking capabilities of Windows tablets, restricting Web-based productivity or the many tasks that require switching back and forth among multiple applications.

 

Therefore, instead of purchasing both a traditional laptop and a tablet device, a single HP EliteBook Revolve 810 G2 would be a less expensive and more secure alternative for the organization and provide a better laptop plus tablet experience for users.

 

To read the longer analysis comparing the two device scenarios, read the Total Cost of Ownership white paper, and for more information on the total cost of ownership, check out the latest video and infographic that highlight how much a business can save when it switches to a 2 in 1 device.

 

Are you weighing new device options? We want to hear from you! Leave a comment below telling us the different factors that influence your decision-making. And join the social conversation on Twitter by using the hashtags: #ITCenter and #2in1

One of the challenging transitions teams need to make when they adopt ITIL Service Management is thinking in terms of Services rather than Products. Products are the means by which services are delivered. However, the Customer usually cares about the service, not the delivery system.

 

By definition, Service Management is about providing services to Customers who do not want to own the processes themselves.  Payroll is an example of a service that was once commonly provided within the finance department. As payroll legislation and processes became more complex, companies developed the capability to deliver payroll in their “computer department”.  Payroll became a service to finance. Once the service was defined, with requirements, costs and performance expectations, the Customer could then do comparison shopping, selecting the best service provider for the best price. Customers could contemplate whether the service would be “contracted” in house or outsourced to an external service provider like ADP. 

 

Defining services falls within the ITIL process groups Service Strategy and Service Design. Service Strategy articulates the services needed by the Customer and Service Design lays out the requirements for the service.  Defining services in the absence of the Service Strategy analysis is difficult for existing product-based organizations, but eventually leads to the same end state if all of ITIL Service Management is adopted.

 

Immediate benefits can be realized. Duplicate services are sometimes identified that had previously been disguised by product offerings.  Also, teams might be performing activities whose services are hard to define, and perhaps are not even needed. These discoveries  lead to great discussions between the Customer and the service providers when writing the contract known as the Service Level Agreement.

 

The Service Level Agreement is where Customer expectations are reconciled with the Customer’s budget. Costs are applied to Customer requirements, and the Customer then has the opportunity to choose the level of service that maximizes their profit. Instead of the Customer arbitrarily insisting on the best possible service, the Customer can think through the service level that is truly needed, based on the actual costs to the business. With expectations clearly articulated, the service provider knows the expected service level targets, and can then demonstrate to the Customer in future meetings how well the expectations were met, and if not, what changes are required going forward.

 

One of the benefits of a service-based organization is the potential for innovation.   If the services can be distilled so that the true services are defined, the service owner can then contemplate alternative and possibly inventive solutions for delivering the service. The emergence of cloud computing is a great illustration of innovative solutions providing the same or better service but freed from the products from which the service was previously provided.

 

IT Service Management can result in innovation and a very pleased Customer – two outcomes not usually anticipated when companies embark on the service management journey.

IT Center

We Are the Perimeter

Posted by IT Center Apr 9, 2014

This Malcolm Harkins blog originally appeared here. It is republished with permission.

 

End users, not technology, define the boundaries of the enterprise. Security strategies must protect this new perimeter.

 

You and I, engineers and software developers, every single employee in your company are part of the collective “we,” and as such each of us has a critical role to play in fortifying our security perimeter and preserving privacy. That perimeter surrounds our direct enterprise, our data, as well as the products, services, and solutions we deliver to our respective customers.

 

The world of computing is rapidly evolving. The traditional model where desktop computers were essentially fixed in place and security and privacy efforts were primarily focused on protecting the network perimeter is obsolete. So what replaces this model?

 

With corporate laptops and tablets, and employees bringing their own devices, the security perimeter has shifted, and it is now swayed by the choices of each employee. This shift brings tremendous benefits, but not without a significant potential for risk. In today’s fast moving environment the question becomes, have you, and every employee within your company, stayed abreast of what it takes to protect your security perimeter?

 

Are you prepared to secure your company’s perimeter as the Internet of Things (IoT) expands at an accelerating rate?

 

This chart from Daily Infographic*, prepared by Xively.com*, shows the projected trajectory for the IoT, and underscores the need to ensure each and every one of us is prepared to embrace these advancements with proper security and privacy precautions.

[Chart: global Internet device forecast]

http://dailyinfographic.com/wp-content/uploads/2014/03/Xively_Infographic-2-1.jpg

 

To be clear, the Internet of Things should make our professional and personal lives easier; but it will not necessarily make them simpler – at least in the short term.

 

In conjunction with the incredible growth and opportunities ahead, we must identify early on the implications that emerging technologies will have on how data is collected, handled, stored, shared, managed, respected, and deleted.

 

When one careless choice can threaten an entire company’s security perimeter, what steps do each of us need to take?

 

The answer is as simple as it is daunting. Each of us needs to increase our knowledge of potential risks and apply it consistently as we make security choices.

It means fundamentally changing company culture. It is not enough to understand how technology works and connects; we must internalize the privacy and security implications of those tools and connections and create an environment that proactively and automatically takes the right actions.

 

It means educating others within our companies and communities about the opportunities and risks each person’s security choices entail, in a way that is easily understandable and compelling so people are both empowered and motivated to make the right choices.

 

It means watching out for each other so when one person’s actions open a potential security or privacy gap, another person is right there to help prevent the risk. It’s the “if you see something, say something” concept, applied to security and privacy.

 

It means recognizing that technical controls alone cannot protect us from rapidly changing attack structures or the complexity of new technologies. It’s time to step up to the expanded role and be part of the solution; we are all a vital part of the security and privacy perimeter.

 

* Other names and brands may be claimed as the property of others.

Managing the Changing IT Landscape: Big Data Analytics

 

There are multiple approaches to building data analytics models to solve a problem. The Intel-sponsored March Machine Learning Mania Kaggle competition has shown us that. The second stage of the contest inspired 440 entries (that’s 440 versions of a predictive analytics model) from the 251 analytics teams. My congratulations to the winner, Grimp Whelken, a Kaggle novice, whose data model best predicted the winners of the NCAA men’s basketball tournament. Here are the final top 10 teams from the competition:

[Leaderboard: final top 10 teams]

 

Lessons for Business from the Big Data Dance

 

Throughout this contest I’ve been interested in how elements of this predictive analytics contest can apply to business and IT. My previous two blogs in this series explored the skills shortage among data scientists and the importance of real-time prediction. As the competition closes and I write this final blog, I notice that the top 10 teams are almost all single players. But were they working alone?

In the competition, Kaggle players were supported by data derived from a sports ratings website and this information was pre-packaged and provided to the contestants by Kaggle.

 

In the real world, identifying the appropriate data, prepping it for use, and validating the model requires a close partnership between IT and the business. Business analysts with their domain expertise often provide a critical perspective on a big data problem and its relevant data. In the Kaggle contest Ken Massey, someone with recognized domain expertise, essentially served that role for each team.

 


 

Small Teams, Big Potential

 

Intel follows a big data project model that empowers small teams to pursue initiatives that can be accomplished in six months and promise $10 million in return on investment (ROI). Intel has successfully completed more than a dozen such projects, and is now pursuing higher-value opportunities with $100 million ROI. These five-person teams (much like a basketball team) tap a variety of skilled positions including IT, advanced analytics, and business experts.

 

Intel’s approach is validated by Tom Davenport, author of the recently published Big Data at Work. Davenport found that the large companies he interviewed for his research were forming teams of people with a range of skills rather than hiring PhD level data scientists on a large scale. The teams included people with quantitative, computational, and business expertise as well as skills in technology, and change management.

Teams bring together the range of skills needed to tackle big data projects—business, analytic, and technology. How does your organization partner with the business to deliver projects of high ROI?

 

Personally, I was very excited to see both the UConn men’s and women’s teams win a historic sweep of the NCAA basketball tournaments. Go Huskies! I’m looking forward to next year to watch the prediction engines in action again!

 

Chris Peters

 

Find Chris on LinkedIn.
Start a conversation with Chris on Twitter.
See previous content from Chris.

#ITCenter #BigData

Long-awaited 64-bit mobile computing is finally here! While 64-bit computing is nothing new, its effects on mobility will be significant. As Intel’s Julie Coppernoll described to PC World, the impacts 64-bit computing will have on mobile computing capabilities include “improvements in video performance, compression, decoding and other intensive computing tasks with a 64-bit chip and compatible OS. Mobile devices will also be able to pack more than 4GB of memory, which could provide performance improvements.”

 

64-bit capabilities allow 64 bits of data to be processed at one time, rather than the traditional 32 bits – see a visual depiction of this processing increase in the infographic below:


(View Full Infographic Here)
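
The more-than-4GB-of-memory point follows directly from the address width; a quick back-of-the-envelope calculation:

```python
# Addressable memory is one practical consequence of wider addresses:
# 32-bit addressing tops out at 4 GiB, while 64-bit addressing reaches 16 EiB.
GIB = 2**30
print((2**32) / GIB)   # 4.0 GiB addressable with 32-bit addresses
print((2**64) / GIB)   # ~1.7e10 GiB (16 EiB) addressable with 64-bit addresses
```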

 

Learn more in A Map to 64-Bit Computing.

True story: doing a migration of your PC on a Monday would not be the best way to start your week, but Mondays are always great days to start planning your migration to a new PC. Let’s start by looking at a typical day. You have 86,400 seconds every day; what if you could spend them being more productive? Spend less time waiting and more time creating content for your business? Over the last few months, I have been watching the clock count down to April 8th, the end of support (EOS) for Windows XP, and started to ask, “Are you ready?” Now, personally, I’m on a Windows 8 detachable 2 in 1, which has blown my mind by giving me a tablet and a laptop all in one. Let’s get back to the subject at hand.

 

The first thing I would do on a Migration Monday is explore the difference between what you have today and what is available. This isn’t just a power and performance discussion; it’s about all the cool form factors that are in the market. Let me offer up this:

 


 

Read the full report.

 

I hope you join me every Monday to talk about Migration and what you are doing to move your business forward.  Until then..

 

Josh Hilliker

As someone who loves technology and pays particularly close attention to enterprise IT trends, it brings me joy to see pragmatic tech evolution happening around me. As I type this blog from my home desk, I can see a stack of paper-based Intel IT Annual Performance Reports, many of which I helped produce from 2009 to 2013.

Now in its 13th year, the Intel IT Annual Performance Report has taken on a new form. It no longer exists in hard copy. Instead, it has moved to a fully digital experience and is delivered primarily through a mobile application. Another key change is that this report is now issued quarterly, not annually.

 

These two changes represent a significant shift in how IT needs to operate: on a faster cadence and utilizing increasingly digital and mobile delivery systems to improve business results.

 

I dug into this year’s report, now titled the Intel IT Business Review, and found a few interesting observations on how Intel IT is transforming its operations and providing corporate leadership as we embrace a new business environment.

 

On the Inside IT Radio Show, Intel’s CIO, Kim Stevenson discusses the changes coming for the Intel IT department in 2014, highlighting the importance of the SMAC Stack (Social, Mobile, Analytics, and Cloud) to enable new customer experiences, enable a scalable way to deliver applications, and ultimately save money.

 

She says that nearly 85% of Intel’s office and enterprise environment is in the cloud. Other big changes include a focus on data center management within a Software Defined Network model, and the increase of internal analytics that has allowed IT to write predictive models to help sales and marketing teams drive revenue.

 

Enterprise organizations are all focusing on these initiatives, knowing that business value is now supported by fast, secure technology. General Motors CIO Randy Mott talks about his focus on bringing GM’s applications up to date.

 

 

This focus on enterprise applications is crucial for IT. As I mentioned in my blog titled The Evolution of Enterprise Mobile Application Development  "[...] the app development process is not without its challenges. There are an increasing number of connected devices and device types running on a range of platforms. Manageability and security challenges are top of mind, especially when it comes to protecting sensitive corporate data. And working with an open-standards-based platform is becoming increasingly important."

 

For example: the Intel IT Business Review is Intel’s first enterprise app that adheres to the Intel IT five-star application program. The five-star system was created by Intel IT in order to ensure that enterprise apps are developed with all stakeholders in mind, paying careful consideration to security, intuitive user experience, multi-platform, multi-device, and multi-interaction functionality.

 

Over the course of my career, I’ve watched IT move from a support role to a strategic driver of business value, enabling decisions based on data, experience, and technological advancements. As roles continue to shift, leadership in IT is paramount. As Intel’s CIO Kim Stevenson notes, “structure follows strategy.” Seeing the stacks of hard-copy Intel Annual IT Performance Reports on my desk and now holding the reader-friendly app in my hand really hammers home the quick evolution of IT in the enterprise.

 

I’ll be writing more blogs about the articles and information found within the app, but I hope you’ll check it out for yourself – I find it fascinating to take an inside look into the IT department for such a large organization.

 

For more information, please visit the Intel IT Business Review homepage.

Recently, I was fortunate to be asked to give a presentation at this year’s GigaOM Structure:Data conference in New York City. David Strom was the moderator, and we had a 20 minute conversation on some of Ford’s recent work and where we are using Data Science. A big thanks to David for a great session. If you’re interested in watching, check out our chat online.

 

One of the original “rock stars” of data science, Hilary Mason (@hmason) was also at the conference. I was in the green room while she was speaking on a panel, so I couldn’t hear the entire session, but I did get to briefly chat with her afterward. If you haven’t already seen it, check out her presentation on learning about your neighborhood through machine learning and data science through hamburger joints.

 

Unfortunately, I was only able to spend one day at the conference, but I enjoyed the format that was more conversational than most conferences that I attend. It was also interesting to hear about how GigaOM is moving into expert-based research products, similar to Gartner and Forrester.

 

Talking with vendors is always interesting. Sometimes I’m just curious about a new technology, but you’ll really get my attention if you can answer some key questions:

 

  • Who are your competitors? If you answer, “We don’t have any competitors,” you’ve lost a lot of credibility.
  • What do you offer over your competitors? A lower price point is good; more or better features are better. But, if you say you have a lower price point and better features, I wonder if you’re lying.
  • If you make it past the first two questions, I’m interested in who your current customers are and whether I can talk to them.

 

Many of the vendors are trying to embed “data science” into their tools and bring data science to the business analyst. I applaud the idea, but the implementations left much to be desired. Firstly, they seem to expect the data to be very clean. Anyone who’s worked in this area knows that data cleaning and understanding can be 80% of the work. Secondly, I wasn’t convinced that the insights passed the ‘so what’ test. One of the toughest tasks for a data scientist is to provide insights from patterns in data, and this is hard to capture in software. Most of the business analysts know their data well and are looking beyond simple correlation and sensitivity analysis.

 

In my team’s role as internal consultants, we’ve had considerable success bringing value to the enterprise by merging datasets across different domains. A good example is our work on the 2013 Escape power liftgate. But we spend much of our time doing data cleaning and creating translation tables. Commiserating with colleagues from Intel, we wondered why it’s taken so long for companies to attack this problem, and we’re hoping that companies like Trifacta and Paxata will help.
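
To illustrate the translation-table point, here is a minimal, made-up sketch (not our actual liftgate data): two datasets from different domains whose keys only line up once a mapping table bridges them.

```python
# Toy example of a cross-domain merge via a translation table.
# All table and column names are invented for illustration.
import pandas as pd

# Warranty claims keyed by dealer part number
claims = pd.DataFrame({
    "dealer_part_no": ["D-1001", "D-2002", "D-1001"],
    "claim_cost": [320.0, 145.0, 410.0],
})

# Engineering test results keyed by internal part ID
tests = pd.DataFrame({
    "eng_part_id": ["P-77", "P-88"],
    "cycles_to_failure": [12000, 45000],
})

# The translation table maps one domain's keys onto the other's
xlat = pd.DataFrame({
    "dealer_part_no": ["D-1001", "D-2002"],
    "eng_part_id": ["P-77", "P-88"],
})

merged = claims.merge(xlat, on="dealer_part_no").merge(tests, on="eng_part_id")
print(merged.groupby("eng_part_id")["claim_cost"].sum())
```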

 

What did you think of GigaOM? Did I miss anything?

 

Michael

 

Read more blogs posts from Michael

Follow Michael on Twitter

Few groups are as critical to creating company value as its sales force. The effectiveness of an organization’s sales force is almost entirely contingent upon the technology used in the field. Without proper hardware - and software - the mobile workforce is left without proper support and is therefore unable to fully contribute to the business. The critical functionality comes from a combination of hardware and software, and it can be completely ineffective if the two are not paired correctly. Choose the right mobile device but the wrong platform and you’ll be left with a lack of functionality and likely some very frustrated employees.

 

So what offering is the best combination of functionality, usability, and productivity to keep reps on the go? When it comes to equipping your sales force with the essential tools, never forget there are two sides to the coin. The device and the platform are of equal importance and should always go hand in hand. What follows are highlights from a third-party case study testing Salesforce.com® Sales Cloud® on three mobile devices. Notice the variety in results and remember how those variables could affect your mobile workforce in the field. Seconds lost on simple processes add up over time to both minutes and dollars. Don’t waste either.

 

-IT Peer Network Administrator

 

Can your sales force be equally effective with Salesforce.com® Sales Cloud® on any mobile device? To find out, a third party put Sales Cloud to the test on three mobile devices.




They tested several scenarios typically performed by sales reps, including meeting with prospects and converting them to qualified opportunities, setting up a presentation for stakeholders, and closing the deal by generating a quote for the customer. Only the Dell™ Venue™ 11 Pro tablet running the full-browser version of Sales Cloud allowed full functionality. The other two devices lacked several features critical to sales reps, including the ability to convert leads to opportunities, generate quotes, and view reports.


If you’re a sales rep or sales manager, you know how important it is to have customer records and sales data at your fingertips. Sales Cloud reliably delivers this vital information to Windows devices through the browser, and to Android™ and iOS® mobile devices through a dedicated Salesforce1 mobile app. Based on this test, the ideal requirements for an optimal user experience when accessing Sales Cloud on the go were identified as being: mobility, usability, and functionality. Of the devices tested, only the Dell tablet powered by an Intel processor and running Windows 8.1 hit the sweet spot encompassing mobility, usability, and functionality.


Sales Cloud users on the go need access to the tools that help them close deals. If their platforms require multiple workarounds, or require them to switch devices throughout the day, productivity and profitability can be severely hampered. The testing showed that, of the devices tested, the Dell tablet powered by an Intel processor and Windows 8.1 offered the best overall combination of mobility, usability, and functionality. The Dell tablet running the full-browser version of Sales Cloud allowed access to all features and functionality. Salesforce compatibility with Windows applications on this device enabled smooth integration with Outlook and Office—both critical tools for many organizations. And because the full-browser view puts more fields at your fingertips, performance of common tasks was much higher on the Dell Venue 11 Pro.


If you want the best combination of usability and productivity for your sales reps on the go, mobile devices powered by Intel® Core™ processors and Windows 8, such as the Dell Venue 11 Pro, offer clear advantages over other tablets.


To read the full study, head to Comparing Salesforce.com® Sales Cloud® on Mobile Devices to learn more. Also check out the blog: You're Only as Mobile as Your Device: How PepsiCo Found Mobility, to learn more about how PepsiCo is utilizing mobile devices to transform how they sell.

Are you currently re-evaluating the needs of your sales force? What pairings of hardware/software seem ideal for your organization?


To join the conversation, comment below or join us on Twitter using the following hashtags:

#ITCenter #tablet

"So how are you handling the death of the PC?"

 

It happens all the time: I'll be talking to someone about technology and the conversation will turn to how Intel is coping with the death of its core business. It's an interesting question, but one that's based on a shaky premise: the idea that the PC industry is dying. Are PC sales slowing down? Sure. Does that mean personal computing is dying? Not really. The fact is the PC is evolving, just like so many other technologies have. PCs, those large, bulky, noisy boxes that used to sit on our desks and floors, are evolving into several different form factors as a response to how consumers want to receive their content and work in a computing environment. It may seem a little like side-stepping, but this is actually what's happening. Someone who might have bought a tower PC ten years ago might be in the market for a PC today and decide on something smaller and more attractive. He or she may want the power and functionality of a desktop PC but not the loud fan or the unattractive cables running from the box to the monitor. Something like the Intel Next Unit of Computing fits that role nicely. For all practical purposes it IS a PC, just a very tiny (and quiet) one. What's more, it can be attached to the back of the monitor, putting it neatly out of sight.

 

Let's say another user needs a PC for their home business but also likes the idea of being able to move it from room to room easily. So rather than disconnecting a bunch of cables, lugging a large tower and monitor into another room every so often, that person might instead opt for a portable all-in-one design. It gives them the power they need but also the portability of just picking it up and carrying it into a different room.

 

And then there are the users who want to be fully portable. They want to be able to carry their device with them and work on the go, but they still need the power of a PC. They need the graphics and the processing power. They'd opt for a tablet, no question.

 

All of these devices have something in common at their core: they're all personal computers. We just don't call them that. Each one of them has its roots in the PC and is simply an evolution of that technology. Each one of them is a personal computing device. It's personal because it's yours, and it computes stuff for you.

 

Some people equate the PC to the dinosaur; they once ruled the Earth but now they're completely gone. I think that misses the point. We shouldn't be focused on the PC, but on computing itself. We may be seeing less of the PC as we've known it for the past few decades, but computing is alive and well. It wasn't too far back in history that most of us got around in horse-drawn wagons. Wagons didn't "die"; they evolved into cars. So while the wagon gradually disappeared, transportation is alive and well. These days none of us want to hitch up a wagon to get somewhere, but I'm pretty sure we still need to get there. That's really what evolution is all about: better ways of doing something. Sure, the size and shape of the box will change, but the core idea of what it's there for stays the same. What we need to do is offer consumers a compelling, efficient, effective computing experience that's better than the old one. When we're making the "next big thing" we have to ask ourselves: is this a cool new toy with lots of pretty lights, or does it actually improve on what we already have? A watch you can take a call on sounds neat, but if it's not more convenient than using your phone, then why bother? A tablet that lets you edit spreadsheets is great, but if doing it that way is harder than on your PC, then why change? Make something better, faster, more efficient, and it will take off like a rocket. Make something different for the sake of different and it will likely fail.

 

So when someone asks me about the death of the PC I remind them it's not about personal computers, it's about personal computing.

Jason Hoffman

Read more posts written by Jason

Information Technology Infrastructure Library (ITIL) and Service Management for many have become generic terms divorced from their true meanings. ITIL is sometimes regarded as just another optional IT (Information Technology) methodology like ISO 9000 and COBIT. Service Management is often an amorphous concept known to be somehow related to ITIL.


Why should people who work in IT care?


They should care because, together, IT and Service Management create the best method for providing maximum customer satisfaction, and for retaining customers rather than having customers search for alternate IT service providers. Unsatisfactory relationships with customers have, in extreme cases, resulted in outsourced IT, which is not a good scenario for either side of the relationship.

 

Service Management provides a way to define, measure, and improve the services between the customer (who is paying IT for services) and IT (the service provider). In the past, many in IT viewed themselves as “providers of products”, such as application software or the hardware upon which the software runs. Service Management views IT as a provider of services, regardless of the products used. The distinction is critical, and it illustrates why IT departments sometimes fail to comprehend the services that the customer truly needs. Thinking of IT as a collection of services can be a new way of viewing the role of IT in a company.

 

Cloud computing vs. managing a data center is a clear illustration of the distinction between providing a service and managing a product. A data center is a product and one possible solution. A cloud computing environment is an alternative solution, but it provides generally the same service. The customer doesn’t care whether the computing environment is virtual or physical – they just want the best service obtainable, for the lowest cost, that meets their requirements. IT departments need to define themselves in terms of the services provided to the customer, so that they are free to supply those services with the best solution possible.

 

ITIL is a decomposition of the processes that comprise an Information Technology organization so that there are no overlaps and no gaps. Among the processes is Service Level Management, which is represented in the agreements between the customer and the service provider. These agreements clearly delineate the service being provided, the service level support, and the associated costs. With expectations and costs documented and agreed upon, there can be little misunderstanding regarding service level requirements. Documentation also provides the basis for periodic reviews, so that customers can decide whether to evolve or change services, and whether to increase or decrease funding based on the business needs.

 

IT organizations that fully embrace service management become indispensable to the business, because they enable the business to maximize their potential.

 

Larry Taylor

 

Read more about Larry

Connect with Larry on LinkedIn

“It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is the most adaptable to change.” —Charles Darwin


In the new world of IT, it really is survival of the fittest. Adaptability, agility, and innovation are critical, and staying on top of the mobile tech transformation might be the key to success.


It’s no longer about IT. It’s about IoT.

Consumerization has forever changed the enterprise landscape, and the role of IT has shifted. Today, IT needs to focus on innovation and flexibility to survive the Internet of Things (IoT). It’s about enabling billions of connected devices to simplify workflows, enhance business processes, and bring even greater business value to customers.


Forrester says we’ve entered the "Age of the customer" with tech-savvy, empowered buyers. This, too, will continue to evolve, and enabling mobile enterprise apps will become an integral part of business success.



IT will play a critical role in developing, integrating, and securing mobile apps that can utilize embedded sensors to capture data and turn that data into useful business information for decision makers. It’s about delivering the right apps, with the right level of security and functionality, to most any mobile device.


Managing complexity with the right approach.


Yet the app development process is not without its challenges. There are an increasing number of connected devices and device types running on a range of platforms. Manageability and security challenges are top of mind, especially when it comes to protecting sensitive corporate data. And working with an open-standards-based platform is becoming increasingly important.
 

With the right tools and best practices, you can minimize complexity in mobile app development. InformationWeek’s guidelines to enterprise app development include basics like enabling the right level of connectivity and supporting push technology, but they go one step further. The third step is gaining full manageability of the apps, which is increasingly critical in a mobile environment. And last but not least is adopting open standards.

 

I recently read an article about the top enterprise app development trends that echoes these themes. (And full disclosure, the Darwin quote above came from this article.) There is a distinct advantage to be gained by adapting proactively to the technology changes in the IT landscape.


Companies that are embracing the mobile tech transformation are forging ahead. Cisco just announced its Enterprise Mobility Services Platform as part of the company’s larger Internet of Things (IoT) strategy. Prashanth Shenoy in the enterprise mobility group at Cisco said it best: “It's no longer an IT conversation. This is all about empowering the business.”




Intel IT takes a cross-platform approach.

 

For Intel IT, part of adapting includes taking a cross-platform approach to mobile app development. By using a hybrid model based on native HTML5 code and a web browser, they are able to simplify development and support a rich user experience.

[Diagram: Intel IT’s enterprise mobile application framework]

The framework is built on a mobile application development platform (MADP) that is supplemented with security capabilities like authentication and encryption. It has four primary components:


  • Standardized development environment
  • Secure runtime environment
  • API aggregation gateway (see the sketch after this list)
  • Mobile management
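
To make the gateway idea a bit more concrete, here is a minimal thought-experiment sketch (this is not Intel IT’s implementation; the backend URLs and token check are placeholders): one mobile-facing endpoint validates a token, fans out to a few backend services, and returns a single combined payload.

```python
# Hypothetical sketch of an API aggregation gateway endpoint.
# Not Intel IT's code -- backend URLs and token validation are placeholders.
import requests
from flask import Flask, abort, jsonify, request

app = Flask(__name__)
BACKENDS = {
    "profile": "https://hr.example.internal/api/profile",          # hypothetical
    "expenses": "https://finance.example.internal/api/expenses",   # hypothetical
}

def token_is_valid(token):
    # Placeholder -- a real gateway would validate against the enterprise identity provider.
    return bool(token)

@app.route("/api/v1/dashboard")
def dashboard():
    token = request.headers.get("Authorization", "")
    if not token_is_valid(token):
        abort(401)
    combined = {}
    for name, url in BACKENDS.items():
        resp = requests.get(url, headers={"Authorization": token}, timeout=5)
        combined[name] = resp.json() if resp.ok else {"error": resp.status_code}
    return jsonify(combined)   # the mobile client makes one round trip instead of several
```

The point of the aggregation layer is that the mobile app talks to one endpoint with one credential, while the gateway handles the fan-out, the security checks, and any backend churn.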

 

You can find out more about the Intel IT framework by reading the white paper, “Implementing a Cross-Platform Enterprise Development Framework.”   You can also tune in to this webinar, where I had the chance to host one of Intel IT’s lead app developers. He discusses why a cross-platform approach is best for Intel’s enterprise. 


I’ll be exploring these topics in future blogs, and I welcome your thoughts as well. How is your IT organization adapting as the Internet of Things continues to take shape?

Chris Peters

Find Chris on LinkedIn.
Start a conversation with Chris on Twitter.
See previous content from Chris.


#ITCenter #EnterpriseMobileApps
