
IT Peer Network


SaaS is not new. It has been used for both business and personal purposes for some time, and in its cloud form for a few years. So what sort of security changes are required to use SaaS in the enterprise? What SaaS challenges is Intel IT encountering? Why now? In this blog I share some real-life SaaS experiences as a cloud and mobile security engineer at Intel, as well as my view of SaaS security.

 

Matching Strategy to the Current Environment

The emergence of new and large use cases triggered Intel IT to revisit our SaaS architecture and look for additional security solutions in the SaaS cloud space. Previously at Intel, use cases for cloud-based SaaS were small and limited to a few users. But the new use cases involved thousands of users, mainstream apps such as data repositories and collaboration, and big business models such as CRM. These large use cases required us to reexamine our SaaS strategy, architecture, and controls to protect those mass deployments. As documented in our recent white paper, these controls center mainly on data protection, authentication and access control, and logs and alerts. We strive to enforce these controls without negatively impacting the user experience or the time to market of SaaS solutions. The paper also discusses how we manage shadow IT—users accessing SaaS services without IT awareness.

 

How We Handle Cloud Traffic Inspection

While the white paper summarizes our SaaS security controls, I’d like to delve a bit deeper into cloud inspection.

 

As is often the case, the right approach wasn’t immediately apparent. We needed to examine the advantages and disadvantages of the various choices – sometimes a complicated process. We investigated two ways we could inspect activity and data:

  • Cloud proxy. In this approach, we would pass all the traffic through a cloud proxy, which inspects the traffic and can also encrypt specific fields, a valuable capability for controlling the traffic and information being passed to the cloud provider. The downside of this solution is that directing all traffic through the proxy might cause performance issues in massive cloud implementations, where the cloud provider has many points of presence around the globe. Cloud proxies can also affect application modules in cases where a reverse proxy is used.
  • Cloud provider APIs. This option uses the cloud provider’s APIs, an approach that allows inspection of user activity, data, and various attributes. The benefit of such an implementation is that it happens behind the scenes and doesn’t impact the user experience (because it is a “system-to-system” connection). The downside is that not all cloud providers offer the same set of APIs. Also, use cases differ between SaaS providers, requiring more time to fine-tune each implementation.
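To make the API option concrete, here is a minimal sketch of an inspection job evaluating audit-log events against a policy. The event fields, actions, and thresholds are hypothetical, not any particular provider's API; a real job would poll the provider's audit endpoint and map its field names.

```python
# Hypothetical policy: flag external sharing and oversized downloads.
POLICY = {
    "blocked_actions": {"share_external"},
    "max_download_mb": 500,
}

def inspect_events(events, policy=POLICY):
    """Return the subset of audit-log events that violate policy.

    `events` mimics what a SaaS provider's audit API might return;
    real providers differ in field names and available actions.
    """
    alerts = []
    for ev in events:
        if ev["action"] in policy["blocked_actions"]:
            alerts.append({**ev, "reason": "blocked action"})
        elif ev.get("size_mb", 0) > policy["max_download_mb"]:
            alerts.append({**ev, "reason": "oversized download"})
    return alerts

# Sample payload, as if parsed from the provider's JSON response.
sample = [
    {"user": "alice", "action": "share_external", "size_mb": 1},
    {"user": "bob", "action": "download", "size_mb": 900},
    {"user": "carol", "action": "view", "size_mb": 0},
]

alerts = inspect_events(sample)
for a in alerts:
    print(a["user"], "-", a["reason"])
```

Because this runs system-to-system, it never touches the user's session, which is exactly why the API approach leaves the user experience intact.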

We reached the conclusion that each solution needs to match the security requirements of the specific use case. Some SaaS implementations require tighter control, some less. We therefore believe it is important to have a toolset from which you can mix and match the needed security controls. And yes, you need to test it!


 

I’d like to hear from other IT professionals. How are you handling SaaS deployments? What controls have you implemented? What best-known methods have you developed, and what are some remaining pain points? I’d be happy to answer your questions and pass along our own SaaS security best practices. Please share your thoughts and insights with me – and your other IT colleagues on the IT Peer Network – by leaving a comment below. Join the conversation!

Back in 1993, when the first 7200-RPM hard drives hit the market, I imagine people thought they could never fill such a drive’s jaw-dropping 2.1GB capacity. Of course, that was before the era of MP3s and digital photos, and any videos you had were on VHS or Beta (or possibly LaserDisc).

 

Today, desktop PCs like the ASUS K20CE* mini PC come with up to a 3TB SSD to accommodate users’ massive collections of HD videos, photos, eBooks, recorded TV, and other huge files. That’s terabytes! And some models feature even more storage.

 

But how do you access these files away from home? You could use one of the many cloud services on the market; however, if you have lots of personal photos and videos, or large documents and files from work, you’ll quickly reach the cap on the free capacity and have to start paying monthly subscription fees. Plus, you’d need to remember to upload any files you might want to access later, and if you want to change services, you must move your files from one network to another, which can be a hassle, not to mention a security concern.


Access your data anytime, anywhere

 

A better option would be to take advantage of Intel ReadyMode Technology (Intel RMT) and third-party remote access software such as Splashtop, Teamviewer, or Microsoft Remote Desktop to turn your desktop PC into an always-available “personal cloud” that lets you access all of your files on your other devices, such as your smartphone or tablet.

 

“With RMT, your data is stored safely in your home computer so you don’t have to worry about people hacking into it. You can access it through remote log on or through VPN,” said Fred Huang, Product Manager, ASUS Desktop Division. “It’s a better way to access your personal files that exists today with ASUS systems running Intel RMT.”

 

Intel RMT replaces the traditional PC sleep state with a quiet, low-power, OS-active state that allows PCs to remain connected, up-to-date, and instantly available when not in use. Plus, it allows background applications—like remote access software—to run with the display off while consuming a fraction of the electricity the PC normally would when fully powered on.

For home PCs, this means you can get the convenience of anytime, cloud-like access to your files without a cloud-service bill, as well as the ability to share files beyond your own personal login.


“Cloud-based storage is usually more personal, so you might have a different account from your spouse or family member, but with a home hub PC, it can be one shared account that the whole family can access,” adds Huang.

 

For businesses, Intel RMT allows employees to use remote access to get to their work files from anywhere without the need for their desktops to remain fully awake and consuming power. Across a large enterprise, that kind of power savings really adds up.
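As a back-of-envelope illustration of how that adds up, here is a sketch with assumed wattages, idle hours, and fleet size. These numbers are hypothetical, not measured Intel or ASUS figures:

```python
# Back-of-envelope sketch with assumed figures:
# a desktop drawing ~60 W fully awake vs ~10 W in a ReadyMode-style
# low-power state, idle 16 h/day, across a 10,000-desktop fleet.
awake_w, low_w = 60, 10
idle_hours_per_day = 16
fleet = 10_000

# Energy saved = wattage delta x idle hours x days x machines, in kWh.
kwh_saved_per_year = (awake_w - low_w) * idle_hours_per_day * 365 * fleet / 1000
print(int(kwh_saved_per_year), "kWh/year saved")
```

Even with conservative inputs, the savings land in the millions of kilowatt-hours per year for a large fleet, which is the point the enterprise argument rests on.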

 

Another business benefit: desktops with Intel RMT enable automatic backups and nightly system health checks to happen efficiently during off hours without waking the machines—saving power while protecting files and uptime.

 

The perfect home (and work) desktop

 

ASUS desktop PCs allow users to do everything from daily tasks to playing 4K Ultra HD video with enhanced energy efficiency, better productivity, and powerful performance across all their form factors. Other highlights include instant logins, voice activation, and instant sync and notifications.

 

And don’t forget about the gamers. RMT can help support game downloads and streaming sessions without wasting a lot of energy. Gamers can also choose to run updates and applications in the background 24/7, or overnight, saving time and energy by being connected to an energy-efficient smart home hub. Take a look at this recap video of the always-available PC from IDF 2015 last month.

 

 

In addition to the ASUS K20 mentioned above, Intel RMT will also be featured in future or succeeding models of the ASUS M32AD* tower PC, ASUS Zen AiO Z240IC* All-in-One, and ASUS E510* mini PC.

 

Want to find out more about what Intel Ready Mode can do? Visit: www.intel.com/readymode.

The practice of using maliciously signed binaries continues to grow.  Digitally signing malware with legitimate credentials is an easy way to make victims believe what they are downloading, seeing, and installing is safe.  That is exactly what the malware writers want you to believe.  But it is not true.

[Chart: Total malicious signed binaries, Q3 2015]

Through the use of stolen or counterfeit signing credentials, attackers can make their code appear trustworthy.  This tactic works very well and is becoming ever more popular as a mechanism to bypass typical security controls. 

 

The latest numbers from the Intel Security Group’s August 2015 McAfee Labs Threat Report reveal a steady climb in the total number of maliciously signed binaries spotted in use on the Internet. It shows a disturbingly healthy growth rate, with total numbers approaching 20 million unique samples detected.

 

Although it takes extra effort to sign malware, it is worth it for the attackers. No longer an exclusive tactic of state-sponsored offensive cyber campaigns, it is now being used by cybercriminals and professional malware writers, and is becoming a widespread problem. Signing allows malware to slip past network filters and security controls, and can be used in phishing campaigns. This is a highly effective trust-based attack, leveraging the very security structures initially developed to reinforce confidence when accessing online content. Signing code began as a way to thwart hackers from secretly injecting Trojans into applications and other malware masquerading as legitimate software. The same practice is in place for verifying the content and authors of messages, such as emails. Hackers have found a way to twist this technology around for their benefit.
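Real code signing rests on certificate chains, but the underlying trust check can be illustrated with a simpler digest-based allowlist. This is a hedged sketch of an integrity control, not a substitute for full signature and certificate validation, and the binary names and contents are invented:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex digest of a binary's contents."""
    return hashlib.sha256(data).hexdigest()

# Illustrative allowlist mapping known-good binaries to their digests;
# in practice such a manifest would itself be signed and distributed by IT.
ALLOWLIST = {}

def register(name: str, data: bytes) -> None:
    ALLOWLIST[name] = sha256_of(data)

def is_trusted(name: str, data: bytes) -> bool:
    """True only if the binary's digest matches the recorded one."""
    return ALLOWLIST.get(name) == sha256_of(data)

register("updater.exe", b"genuine build 1.0")
print(is_trusted("updater.exe", b"genuine build 1.0"))  # original binary
print(is_trusted("updater.exe", b"tampered build"))     # digest differs
```

The point of the sketch is the same as the point of code signing: any modification to the binary, however small, changes the digest and breaks the trust decision. Attackers defeat the real scheme not by forging digests but by stealing the credentials that vouch for them.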

 

The industry has known of this emerging problem for some time. New tools and practices are being developed and employed. Detective and corrective controls are being integrated into host, data center, and network-based defenses. But adoption is slow, which affords a huge opportunity for attackers.

 

The demand for stolen certificates is rising, driven by increasing usage and partly by an erosion effect of better security tools and practices, which work to reduce the window of time any misused signature remains valuable. Malware writers want a steady stream of fresh, highly trusted credentials to exploit. Hackers who breach networks are harvesting these valuable assets, and we are now seeing new malware with features to steal victims’ credentials. A new variant of the notorious Zeus malware family, “Sphinx,” is designed to allow cybercriminals to steal digital certificates. The attacker community is quickly adapting to fulfill market needs.

 

Maliciously signed malware is a significant and largely underestimated problem that undermines the structures of trust that computer and transaction systems rely upon. Signed binaries are much more dangerous than garden-variety malware. Until effective and pervasive security measures are in place, this problem will grow in size and severity.

 

Twitter: @Matt_Rosenquist

LinkedIn: http://linkedin.com/in/matthewrosenquist

I feel very fortunate to be a part of the hugely exciting culture of innovation that is making its mark in Israel at the moment. The country has a reputation as fertile ground for start-up companies to flourish, but it’s also seeing a rapid pace of technological innovation. I recently returned to Israel after living abroad for a number of years, and the sheer scale of new development is amazing – even more so when you consider our relatively small population. Office blocks and research labs are shooting up, more and more high-end, high-value products are being manufactured, and investment and M&A activity is huge. I recently spoke to Guy Bar-Ner, regional sales director for Intel Israel, about what this growth means for the local industry.

 

To put this growth into perspective: there are currently 74 Israeli companies listed on Nasdaq, one of the largest representations for a non-US country. The national economy is strong and the high-tech industry is doing well. It’s a great time to be in business here.



Igniting Innovation


Guy said: "Being part of the Intel Sales and Marketing team based in Israel means I have lots of opportunities to get involved with some of the most exciting developments and play a role in helping drive the industry forward.

 

With a large (10,000-strong) presence, Intel Israel is in a strong position to help make a difference. We consolidated this position recently when we opened our IoT Ignition Lab in Tel Aviv. Our vision for the Lab is to provide local companies with the resources, space and tools they need to get their Internet of Things (IoT) ideas off the ground. This is the first time we’ve been able to offer such dedicated support to companies both large and small in the country, and after just two months of operation, it’s already showing promising results.

 

We offer companies that are innovating in the IoT space the opportunity to work with Intel’s technical experts to identify opportunities to develop their solutions on Intel® architecture, and then provide them with the resources to build or enhance their solutions, and a platform on which to showcase them to prospective customers through the Lab’s demo center.

 

The Lab focuses on four key pillars – Smart Cities, Smart Transportation, Smart Agriculture and Smart Home – but provides support and resources for any kind of IoT project that qualifies. At the moment, we’re working on a couple of exciting projects, including a Smart Cities solution from IPgallery, a Smart Transportation/Supply Chain solution from CartaSense and a personalized music solution from Sevenpop.

 

In addition to our work with local IoT companies, we’re using the IoT Ignition Labs to support Israel’s strong (and growing) maker/developer community. We have about 500 of these visionary folks just among the Intel Israel employees. They take part in many maker/developer hackathons and meet-up events during the year.  The size of the overall Israel maker/developer community is amazing, holding up to ten meet-ups on various technology-related topics per week in the greater Tel Aviv area alone. The ideas that this community comes up with are fantastic – in fact it was a team from Israel that won first place in the Intel® Edison Make It Pro Challenge last year.

 

We’re keen to support these innovators by offering access to Intel resources and products to help them build the must-have solutions of tomorrow. We’ve been running hackathons to give them a forum in which to work together and come up with new ideas, and the winners of the hackathons are then welcomed into the Ignition Lab to work alongside the Intel experts to develop their idea into a marketable solution. In addition, the Intel Ingenuity Partner Program (IIPP) is now up and running, working with a select few start-ups to help them build and market their Intel architecture-based solutions. The combination of the IIPP and the Intel IoT Ignition Lab is a fantastic way for start-ups to develop new and exciting solutions.

 

 

Engaging with the IoT Community



Meanwhile, we’re also taking the opportunity to drive further collaboration with the local community of start-ups and innovators at the upcoming DLD Innovation Festival, which is taking place in Tel Aviv in early September. For the first time, Intel will be taking part directly in this event, and we’ll be hosting a number of events and activities at the Intel Innovation building near the main entrance on September 8th and 9th, including:

 

  • Speakers with new perspectives: Intel experts in areas such as IoT, wearables, video, media and connectivity will share their thoughts on a range of technology topics beyond Intel’s traditional business.
  • Express Connect: We’ll be offering a match-up service for conference attendees to meet with Intel leaders and topic experts by appointment for more tailored, in-depth discussions.
  • Showcase area: Some of the new and exciting Intel® technologies such as Intel® RealSense™ technology, Intel’s Wireless Connectivity, smart home and advanced analytics solutions will be on display as part of an ‘airport terminal of the future’ area.
  • Live hackathon: Members of Intel’s own developer community will run an IoT-themed hackathon event using Intel Edison to find the next IoT Ignition Labs project. This will be run in collaboration with the Open Interconnect Consortium (OIC) and will highlight how the OIC and Intel are collaborating to create a smarter world.

 

I invite everyone to come to the DLD event to experience Intel’s technology in action and engage with the people at Intel who are creating the future."

 







To continue the conversation on Twitter, please follow us at @IntelIoT 

The role of IT decision maker has dramatically changed in the past few decades, as technology continues to weave tightly into business strategy. IT leaders are helping business leaders build a successful roadmap by implementing strategies built on cloud, analytics, and new digital tools. Big initiatives, however, come with big decisions, and they require the wherewithal to know which projects take priority.

 

We launched a poll on our Intel IT Center LinkedIn showcase page to find out what fires IT decision makers tend to extinguish first. The Internet is inundated with lists, blogs, and articles dedicated to top issues and concerns plaguing IT. These buzz-worthy topics include cloud, security, and big data, and we expected one of those to top the list.

 

Some IT Surprises

In our poll of more than 300 participants, 34 percent pinpointed hardware refresh as their top concern. Cloud structure (20 percent), software refresh (17 percent), and mastering data analytics (12 percent) rounded out the top four.

 

Security finished seventh with a little over 1 percent; this was one of the biggest surprises of the poll, especially with the large number of high-profile breaches and cybersecurity issues troubling enterprises of late. Cloud concerns were lower than projected as well, even after Microsoft’s recent release of Windows 10.

 

Some notables in the “Other” category (which accounted for 4 percent of the results) included customer-facing systems and hiring. Should IT be putting more thought into retaining talent, company culture, or customer needs?

 

IT Decision Makers Pick Hardware Over All Else

As noted, IT executives have a lot on their plates. The majority of respondents are focusing on top-notch hardware first — ditching legacy technology in favor of higher productivity, flexibility, and less downtime. The much-discussed data analytics, cloud, and security didn’t rank as high as we expected, but we’re more interested in knowing what you think. How would you rank your biggest concerns as an IT decision maker?


Mobile devices are increasingly connected, as is the software that runs on them. But the true value of mobility can’t be realized until that software takes advantage of the necessary integration among the underlying systems.

 

The same principles hold true for mobile business intelligence (BI). Therefore, when you’re developing a mobile BI strategy, you need to capitalize on opportunities for system integration that can enhance your end product. Typically, system integration in mobile BI can be categorized into three options.

 

 

Option One: Standard Mobile Features Expand Capabilities

 

Whether the solution is built in-house or purchased, these features are considered standard because they use existing, well-known capabilities of mobile devices, such as e-mailing, sharing a link, or capturing a screenshot. They provide methods of sharing mobile BI content and enable collaboration without a lot of investment by development teams.

 

A typical example is the ability to share the report output with other users via e-mail with a simple tap of a button located on the report. This simple yet extremely powerful option allows immediate execution of actionable insight. Additional capabilities, such as annotating or sharing specific sections of a report, add precision and focus to the message being delivered or the content being shared. In custom-designed mobile BI solutions, the share-via-e-mail option can be further programmed to attach a copy of the report to an e-mail template, eliminating the need for the user to compose the message from scratch.

 

Taking advantage of dialing phone numbers or posting content to internal or external collaboration sites is another example. An account executive (AE) could run a mobile BI report that lists the top 10 customers, including their phone numbers. Then, when the AE taps on the phone number, the mobile device will automatically call the number.
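Under the hood, the tap-to-call and share-by-email behaviors come down to building standard URI schemes (tel: and mailto:) from report fields. A minimal sketch, with hypothetical report data and addresses:

```python
from urllib.parse import quote

def tel_link(phone: str) -> str:
    # Strip formatting so the dialer receives only digits and '+'.
    digits = "".join(ch for ch in phone if ch.isdigit() or ch == "+")
    return f"tel:{digits}"

def mailto_link(to: str, subject: str, body: str) -> str:
    # Percent-encode so spaces and punctuation survive inside the URI.
    return f"mailto:{to}?subject={quote(subject)}&body={quote(body)}"

# Hypothetical row from a top-10 customers report.
row = {"customer": "Acme Corp", "phone": "+1 (555) 010-0199"}
print(tel_link(row["phone"]))
print(mailto_link("ae@example.com",
                  "Top-10 report: Acme Corp",
                  "Sharing the latest figures for Acme Corp."))
```

The mobile OS does the rest: tapping a tel: link opens the dialer, and a mailto: link opens a pre-filled message, which is why these features cost development teams so little.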

 

 

Option Two: Basic Integration with Other Systems Improves Productivity

 

A basic integration example is the ability to launch another mobile application from a mobile BI report. Unlike in Option One, this step requires the mobile BI report to pass the required input parameters to the target application. Looking at the same example of a top 10 customers report, the AE may need to review additional detail before making the phone call to the customer. The mobile BI report can be designed so that the customer account name is listed as a hotlink. When the AE taps the customer name, the CRM application is launched automatically and the account number is passed on, as well as the AE’s user credentials.

 

This type of integration can be considered basic because it provides automation for steps that the user could otherwise have performed manually: run the mobile BI report, copy or write down the customer account number, open the CRM app, log in to the system, and search for the account number. All of these are manual steps that can be considered “productivity leaks.” However, this type of integration differs from that described in Option One because there is a handshake between the two systems, which talk to each other. When using standard features, the report is attached to the e-mail message without any additional logic to check for anything else—hence, no handshake is required.
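The handshake above can be sketched as a deep link that carries the account number and user identity to the CRM app. The URL scheme and parameter names below are illustrative only, since each CRM vendor defines its own scheme and expected fields:

```python
from urllib.parse import urlencode

def crm_deep_link(account_id: str, user: str) -> str:
    """Build a deep link that opens the CRM app on a specific account.

    The 'crmapp://' scheme and parameter names are hypothetical;
    a real integration would also pass a session token rather than
    a bare user name.
    """
    params = urlencode({"account": account_id, "user": user})
    return f"crmapp://account/open?{params}"

link = crm_deep_link("ACCT-00042", "ae_jdoe")
print(link)
```

When the AE taps the hotlinked customer name, the BI app hands this URL to the OS, which launches the CRM app with the account pre-selected, collapsing the five manual steps into one tap.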

 

 

Option Three: Advanced Integration with Other Systems Offers Maximum Value

 

Of the three options, this is the most complicated one because it requires a “true” integration of the systems involved. This category includes those cases where the handshake among the systems involved (it could be more than two) may require execution of additional logic or tasks that the end user may not be able to perform manually (unlike those mentioned in Option Two).

 

Taking it a step further, the integration may require write-back capabilities and/or what-if scenarios that may be linked to specific business processes. For example, a sales manager may run a sales forecast report and have the capability of manually overwriting one of the forecast measures. This action would then trigger multiple updates to reflect the change, not only on the mobile BI report but also on the source system. To make things more interesting, the update may need to be real time, a requirement that will further complicate the design and implementation of the mobile BI solution.
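A write-back flow like the forecast-override example can be sketched as follows. The in-memory "source system" and audit trail are stand-ins for real update APIs and transactional logic, which is where the design complexity actually lives:

```python
# Minimal write-back sketch: the forecast source system is modeled as a
# dict, and an audit trail records every override. A real implementation
# would call the source system's update API inside a transaction and
# push the change to other open report sessions.
source_system = {"Q4-forecast": 1_000_000}
audit_log = []

def write_back(measure: str, new_value: float, user: str) -> float:
    old = source_system[measure]
    source_system[measure] = new_value  # propagate to the source of record
    audit_log.append({"measure": measure, "old": old,
                      "new": new_value, "by": user})
    return new_value

write_back("Q4-forecast", 1_150_000, "sales_mgr")
print(source_system["Q4-forecast"], len(audit_log))
```

The audit entry matters: because write-back changes the source of record, every override needs to be attributable, which is one reason this option is the most complicated of the three.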

 

 

Bottom Line: System Integration Improves the Overall Value

 

No matter what opportunities for system integration exist, you must find a way to capitalize on them without, of course, jeopardizing your deliverables. You need to weigh the benefits and costs for these opportunities against your scope, timeline, and budget. If mobile BI is going to provide a framework for faster, better-informed decision making that will drive growth and profitability, system integration can become another tool in your arsenal.

 

Think about it. Besides – how can we achieve productivity gains if we’re asking our users to do the heavy lifting for tasks that could be automated through system integration?

 

Where do you see the biggest opportunity for system integration in your mobile BI strategy?

 

Stay tuned for my next blog in the Mobile BI Strategy series.

 

Connect with me on Twitter (@KaanTurnali) and LinkedIn.

 

This story originally appeared on the SAP Analytics Blog.

Cybercriminals are fully embracing ransomware. Ransomware, a specific form of malware that encrypts files and extorts money from victims, is quickly becoming a favorite among criminals. It is easy to develop, simple to execute, and does a very good job of compelling users to pay in order to regain access to their precious files or systems. Nearly every person and business is a potential victim. More importantly, people are paying. Even law enforcement organizations have fallen victim, only to cede defeat and pay the criminals to restore access to their digital files or computers.

 

Ransomware is on the rise in 2015. The Intel Security Group’s August 2015 McAfee Labs Threat Report shows new ransomware growth at 58% for the second quarter of 2015. 


In just the first half of 2015, the number of ransomware samples exploded with a gain of nearly 190%. Compare that to the 127% growth for the whole of 2014. We predicted a spike in such personal attacks for this year, but I am shocked at how fast the criminals have accelerated code development.

 

Total ransomware has quickly exceeded 4 million unique samples in the wild. If the trend continues, by the end of the year we will have over 5 million unique samples of this malware to deal with.
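With illustrative numbers, the year-end projection can be sanity-checked. The Q2 new-sample count below is an assumption for the sketch, not a figure from the report; only the ~4 million mid-year total and the 58% quarterly growth rate come from the text above:

```python
# Illustrative projection: total samples stood near 4M at mid-year.
# Assume (hypothetically) 500k new samples in Q2 and that new-sample
# counts keep growing ~58% quarter over quarter, per the Q2 rate.
total_mid_year = 4.0e6
q2_new = 0.5e6
growth = 0.58

q3_new = q2_new * (1 + growth)          # projected new samples in Q3
q4_new = q3_new * (1 + growth)          # projected new samples in Q4
year_end = total_mid_year + q3_new + q4_new
print(round(year_end / 1e6, 2), "million samples projected")
```

Even with a modest assumed Q2 base, compounding at the reported rate comfortably clears the 5 million mark by year end.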

 

Cybercriminals have found a spectacular method of fleecing a broad community of potential victims. Ransomware uses proven technology to undermine security. Encryption, the long-time friend of cybersecurity professionals, can also be used by nefarious elements to cause harm. It is just a tool; how it is wielded determines whether it is beneficial or caustic. In this case, ransomware uses encryption to scramble select data or critical system files in a way that is recoverable only with a key the attackers possess. The locked files never leave the system, but they are unusable until decrypted. Attackers then offer to provide the key or an unlocking service for a fee. Normally in the hundreds of dollars, the fee is typically requested in a cryptocurrency like Bitcoin. This makes the payment transaction irreversible and makes it extremely difficult to attribute the attack or identify who is on the receiving end.

 

This type of attack is very personal in nature and specific in what it targets. It may lock treasured pictures, game accounts, financial records, legal documents, or work files. These are important to us personally or professionally, and that importance is a strong motivator to pay the criminals.

[Chart: Total ransomware, Q3 2015]

Payment simply reinforces the attackers’ motivation to use this method again and adds resources for continued investment in new tools and techniques. The technical bar for entry into this criminal activity is lowering as malware writers make this type of attack easier for anyone to attempt. In June, the author of the TOX variant offered ransomware as a service. The criminal made software available for other criminals to distribute; it would handle all the back-end transactions and provide the author a 20% skim of the ransoms being paid. Fortunately, the author was influenced to a better path after being exposed by Intel Security. More recently, an open source kit named Hidden Tear was developed that lets novices create their own fully functional ransomware code. Although not too sophisticated, it is a watershed moment showing just how accessible making this type of malware is becoming. I expect future open source and software-as-a-service efforts to rapidly improve in quality, features, and availability.

 

Ransomware will continue to be a major problem. More sophisticated cybercriminals will begin integrating it with other exploitation techniques such as malvertising ad services, malicious websites, bot uploads, fake software updates, watering-hole attacks, spoofed emails, personalized phishing, and signed Trojan downloads. Ransomware will grow, more people and businesses will be affected, and it will become more difficult to recover without paying the ransom. The growth in new ransomware samples is an indication of things to come.

Now more than ever, bookings are on the rise for the cruise industry. Cruise Lines International Association, the industry's largest trade association, estimates that 23 million people will board ocean-bound cruise ships this year, an increase of 4.4 percent over last year.



With an increasing number of potential passengers desiring to take their first cruise, making strategic business choices that strengthen the relationship between a cruise line and its passengers is of paramount importance. Princess Cruises can say confidently that they have the technology necessary to ensure and support a better user experience for both their customers and their staff. The understanding that customer service is the heart of their business, as well as the support of Intel technology, enables them to position themselves ahead of their competitors. Why is that?

 

Love at First Call

 

Princess Cruises, one of the largest cruise lines worldwide, transports passengers to more than 300 exotic locations yearly. While providing world-class amenities, activities, and entertainment for its guests, Princess Cruises uses more than 6,000 geographically dispersed PCs to handle a full range of business tasks.

 

Of these PCs, 780 make up the company’s worldwide customer call center, which provides the first interaction between potential clients and Princess Cruises’ vacation planners. With all new PCs equipped with Intel Core vPro processors, Princess Cruises can confidently meet customer service demands. Whether guests are booking their first cruise or exploring exotic locales out on the open sea, these PCs are able to run the variety of business applications essential to meeting customer demands and building a relationship of trust.

 

Princess Cruises uses a system in which cruise vacation planners are linked with the same passengers for all correspondence. This allows passengers to have a more personal and familiar experience when calling with updates or questions. This level of customer engagement is made possible by Intel technology. With a performance increase of 15 percent for Princess Cruises’ internal booking engine, technical difficulties and poor performance are things of the past. Passengers are happy to speak with cruise vacation planners who can search, look up, and deliver information in a much more timely fashion. Who says love doesn’t start at first call?


 

Global Connectivity Support

 

One of the challenges Princess Cruises faces is ensuring that their employees are ready to provide answers and solutions to their guests’ questions and problems effectively. The solution to this challenge is making sure that employees are not only skilled in customer service, but are given the right tools to enable efficiency.

 

Being a global business, Princess Cruises has many remote locations, like Alaska, where certain services available for cruise passengers, such as motor coach or hotel services, can only be arranged through applications found on a PC. In the past, if an employee were to experience PC problems, they would be unable to move forward until they received a desk-side visit or chose to mail in their PC for repairs.

 

In our fast-paced world, where guests are increasingly accustomed to instant satisfaction, waiting days for a PC to be repaired in order to fulfill a guest’s request is simply not a viable option.

 

Luckily, Princess Cruises’ new PCs allow employees to take advantage of the Intel vPro Platform, which enables remote management. Now, no matter where employees are located worldwide, they can be assured of access to PC help.

 

What does that mean? More valuable time that can be spent engaging with customers and building a relationship of trust. With constant accessibility to PC help, employees can feel confident in their tools, and in turn, empowered to provide world-class customer service to their guests.

 

Amplify Your Value.jpgAmplify Your Value and you can Reap the Rewards...that’s kind of the theme of this entire series...how you can amplify the value of your IT department and how your company can reap the rewards. But this post is not a summary of our journey, it is about the next step on our journey. It is a post about an organization moving head first into the cloud, moving head first into buy versus build, and moving head first into changing its operating model, deciding to develop its own loyalty card program and execute one of the most impactful “IT Projects” in the 85-year history of our company. But...let me start at the beginning.


It was mid 2010 and I had just joined Goodwill Industries of Central Indiana as CIO. That first week, one of the meetings I had, in fact the first meeting with a peer VP, was with our VP of Marketing. That meeting covered a lot of ground and various topics. One that stood out for me was when she mentioned Goodwill had been discussing gift cards and loyalty cards for about eight or ten years but it never seemed to move forward. She even pulled out a folder that had enough of a thud factor to make any contract attorney jealous. It contained page after page of meeting minutes, email correspondence, and requirements. I was floored...eight years? Of talking? What was the roadblock?


A few days later, I was meeting with the VP of Retail. Again, we talked about a lot of different topics. Sure enough, the conversation soon rolled around to gift cards and loyalty cards. We’d been talking about it for eight years...and we’d made no progress...eight years? Of talking? What was the roadblock?


That afternoon, I met with a couple of folks from my new staff. “What’s up with this gift card and loyalty card thing?”, I asked. Eight years? Of talking? What was the roadblock?


So, since this is my blog, I get to use my “bully pulpit” to air some dirty laundry and perhaps, depending on who you ask, some revisionist history. It seemed the problem was that Marketing blamed Retail’s inability to define requirements, Retail blamed IT for always saying “no, we can’t do that,” and IT blamed Marketing for wanting to discuss ad nauseam but never moving forward. I vowed this was going to change. So in the midst of our Strategic Planning process, I called a meeting to discuss: gift cards and loyalty cards. After all, it was very near to my sweet spot...early in my career I had spent 12 years in banking, specifically in credit cards.


As the year progressed, we began to define requirements and search for commercial gift and loyalty card offerings. Within a few short months, the team decided to separate the project into two phases: phase one would be gift cards and phase two would be loyalty cards. With that decision, the project kicked into high gear. Given our Point of Sale system and our requirements, we quickly identified a gift card software provider, and within months we launched our gift card program.


Several weeks later, we reconvened our team of Marketing, Retail and IT to start on loyalty cards. We further defined our requirements. We wanted a random reward system, not a points-based system; we wanted flexibility in the rewards offered; and most importantly, we wanted to track and drive two different behaviors on the same card: shopping and donating. Throughout the winter, we evaluated many off-the-shelf solutions. However, it was becoming readily apparent that no off-the-shelf solution was going to meet our requirements. Sure, they all offered flexibility in the rewards, but they were all based on earning points, and none of them could track two different behaviors on the same card. Even taking that into consideration, the team was narrowing the selection down to a handful of packages that met at least some of the requirements.


I knew we had to build it. We had to deviate from our cloud-first, buy strategy and build it ourselves. There was no other way. With that in mind, we developed a response to the RFP we had issued. It was basically a general design document of what could be built. We submitted our “RFP Response” to the team along with the two or three commercial packages that had been down-selected. As selection day quickly approached, I made it a point to discuss the proposal in detail with the VP of Retail and the VP of Marketing. I could tell they were skeptical that IT could pull it off. I assured them we could, and quite frankly, played the “new guy card” and asked for a chance.


Our proposal was selected; now it was time to put up or shut up. We engaged a local firm (Arete Software) to build the initial database and prototype, and then shifted to the internal team. As we worked feverishly on the code, the project team defined the goals and targets for success. The launch date would be November 11, 2011 (11/11/11); we would achieve an 11% increase in retail sales; our average shopping cart would increase by $5; and we would have 100,000 cardholders at the end of the first year.


Over the course of the summer and the fall, the team worked faithfully to hit the target date. Finally...go live...the organization that was moving head first into the cloud, moving head first into buy versus build, moving head first into changing its operating model, launched its loyalty card program...Goodwill Rewards (™).


Yes, we hit our target dates; yes, we hit our budget; but, how did we do on our goals? Our increase in retail sales was 13%, beating our target by 2%; our average shopping cart did improve, but fell short of our goal (our lessons learned review identified some areas for improvement here); and, we blew past the 100,000 cardholder mark in under six months, in fact, at the end of year one we had over a quarter of a million cardholders, today we have over 550,000 (remarkable, considering our geographic territory is 29 counties in Central Indiana...yes, 550,000 cardholders in just 29 counties in Indiana).


To further validate our success, we were awarded the Society of Information Management of Indiana’s Innovation of the Year award in 2012. Additionally, we licensed the software to a couple of other Goodwill organizations in the US, turning us into, if not a profit generator, at least a revenue generator for the company.


How were we able to achieve this? First, it truly was a team effort. In fact, I believe one of the most important outcomes of this project was for Marketing, Retail and IT to work together, as a team, to achieve a common goal. Second, our path to amplify our value by leveraging cloud technologies and avoiding C-F Projects (see That Project is a Real Cluster!) enabled us to spend our energy on this A-C project. Third, the environment and culture enabled us to take a risk, to step into the unknown, to ask for and receive the support to move forward.


Next month, we will eliminate even more C-F Projects by looking at disaster recovery in: Amplify Your Value: A Tale of Two Recoveries.


The series “Amplify Your Value” explores our five-year plan to move from an ad hoc, reactionary IT department to a value-add, revenue-generating partner. #AmplifyYourValue


We could not have made this journey without the support of several partners, including, but not limited to: Bluelock, Level 3 (TWTelecom), Lifeline Data Centers, Netfor, and CDW. (mentions of partner companies should be considered my personal endorsement based on our experience and on our projects and should NOT be considered an endorsement by my company or its affiliates).


Jeffrey Ton is the SVP and Chief Information Officer for Goodwill Industries of Central Indiana, providing vision and leadership in the continued development and implementation of the enterprise-wide information technology and marketing portfolios, including applications, information & data management, infrastructure, security and telecommunications.


Find him on LinkedIn.

Follow him on Twitter (@jtongici)

Add him to your circles on Google+

Check out more of his posts on Intel's IT Peer Network

Read more from Jeff on Rivers of Thought

Fig1.png

Enterprise IT users switch between a multitude of programs and devices on a daily basis. Inconsistencies between user interfaces can slow enterprise users’ productivity, as those users may enter the same information repeatedly or need to figure out what format to enter data in (e.g., specifying an employee might be done with an employee number, a name, or an e-mail address). On the application development side, code for user interfaces may be written over and over again. One approach to solving these problems is to create a common User Experience (UX) framework that facilitates discussion and the production of shareable interface templates and code. Intel IT took the challenge to do just that, with the goals of increasing employee productivity by at least 25% and achieving 100% adoption. To create that unified enterprise UX framework, Big Data approaches were critical, as described in this white paper from IT@Intel.

 

To understand the requirements for the enterprise UX, two sources of data are available, but each has its own problems. Traditional UX research methods like surveys, narratives, or observations typically yield unstructured data and often lack statistical significance. Usage data from logs, by contrast, come in large volumes and put user privacy at risk. Unstructured, varied, and voluminous data are a perfect fit for Big Data techniques. We used de-identification (aka anonymization) to hide the personal information of users, combining these techniques with Big Data to create the Cloudera Hadoop-based analysis platform shown to the right. Fig2.png
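As an illustration of the de-identification idea, not Intel IT’s actual pipeline: one common approach is to replace each user identifier with a keyed hash before log records enter the analysis cluster, so usage behavior stays linkable per (pseudonymous) user while the real identity never leaves the source system. The field names and the choice of HMAC-SHA-256 below are assumptions for the sketch.

```python
import hmac
import hashlib

# Secret key kept outside the analysis cluster; the hash alone cannot
# be reversed to the original user ID. (Illustrative value only.)
PEPPER = b"replace-with-a-secret-key"

def pseudonymize(user_id: str) -> str:
    """Replace a user ID with a stable keyed hash (HMAC-SHA-256)."""
    return hmac.new(PEPPER, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

def de_identify(record: dict) -> dict:
    """Strip direct identifiers from a log record, keeping the behavioral data."""
    clean = dict(record)
    clean["user"] = pseudonymize(record["user"])
    clean.pop("email", None)  # drop fields with no analytic value
    return clean

record = {"user": "jdoe", "email": "jdoe@example.com", "app": "wiki", "clicks": 42}
print(de_identify(record))
```

Because the hash is stable, all of a user’s events still group together for analysis, which is exactly what a usage study needs without exposing who the user is.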

 

Using that analysis platform, Intel IT’s UX team created a single framework standard for all enterprise solutions, which 60% of Intel IT’s staff can take advantage of. Data from this platform was also used to select and implement a new internal social platform. The analysis platform has also been used to analyze other aspects of user behavior, which we plan to write about in a future IT@Intel white paper.

 

In addition to the white paper, more detail on the development of the UX framework can be found in the following papers:

 

Regarding our use of de-identification/anonymization, we talked about our early explorations in this white paper, and a more detailed analysis of the challenges of using de-identification in an enterprise setting is presented in this conference paper:

Malware development remains healthy. The Intel Security Group's August 2015 McAfee Labs Threat Report shows malware quarterly growth at 12% for the second quarter of 2015. In total, the count of known unique malware samples has reached a mesmerizing 433 million.

2015 Q3 Total Malware.jpg

Oddly, this has become a very stable trend. For many years, the growth in malware samples has remained relatively consistent, at roughly a 50% increase annually.

 

Which makes absolutely no sense! 

 

Cybersecurity is an industry of radical changes, volatile events, and chaotic metrics. The growth of users, devices, data, new technologies, adaptive security controls, and dissimilar types of attacks differs each year. Yet the number of malware samples being developed plods on with a consistent and predictable gain.

 

What is going on?

 

Well colleagues, I believe we are witnessing a macro trend which incorporates the natural equilibrium occurring between symbiotic adversaries. 

 

Let me jump off topic for a moment. Yes, cyber attackers and defenders have a symbiotic relationship. There, I said it. Without attacks, security would have no justification for existence. Nobody would invest, and most, if not all, of the security we have today would not exist. Conversely, attackers need security to keep their potential victims healthy, online, and valuable as targets. Just as lions need a healthy herd to hunt in order to avoid extinction, attackers need defenders to ensure computing continues to grow and become more relevant. If security were not present to hold everything together, attackers would decimate systems, and in short order nobody would use them. The herd would disappear. So yes, a healthy electronic ecosystem has either a proper balance of both predator and prey, or a complete omission of both.

 

Back to this mind-boggling trend. I believe the steady growth of malware samples is a high-level manifestation of countless micro strategies and counter tactics maneuvering in combination. As one group moves for an advantage, the other counters to ensure they are not defeated. This continues on many fronts, all the time. No clear winner, but no complete loser either. The players don’t consciously think this way; it is simply the nature of the symbiotic adversarial relationship.

I have a malware theory, and only time will tell if it turns into a law or dust. My theory, that “malware rates will continue to steadily increase by roughly 50% annually, regardless of security or threat maneuvering,” reflects the adversarial equilibrium that exists between attackers and defenders. Only something staggering, something that would profoundly upset the balance, will change that rate. If my theory is correct, we should break the half-billion mark in Q4 2015.
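The arithmetic behind the half-billion prediction can be checked in a few lines, assuming only the two figures from the report: roughly 433 million samples in Q2 2015 and 12% growth per quarter.

```python
# Project total known malware samples at 12% quarterly growth,
# starting from the ~433 million reported for Q2 2015.
samples_q2 = 433e6
quarterly_growth = 0.12

q3 = samples_q2 * (1 + quarterly_growth)       # ~485 million
q4 = q3 * (1 + quarterly_growth)               # ~543 million
annual_factor = (1 + quarterly_growth) ** 4    # compounding four quarters

print(f"Q3 2015 projection: {q3/1e6:.0f} million")
print(f"Q4 2015 projection: {q4/1e6:.0f} million")
print(f"Implied annual growth: {annual_factor - 1:.0%}")
```

Two quarters of 12% growth carry the count past 500 million in Q4 2015, consistent with the prediction. Note that 12% per quarter actually compounds to roughly 57% per year, slightly above the ~50% annual figure the theory quotes.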

 

So I believe this trend is likely here to stay.  It also provides important insights to our crazy industry and why we are at this balance point.

 

Even in the face of new security technologies, innovative controls, and improved configurations, malware writers continue to invest in this method because it remains successful.  Malware continues to be the preferred method to control and manipulate systems, and access information.  It just works.  Attackers, if nothing else, are practical.  Why strive to develop elaborate methods when malware gets the job done?  (See my rants on path of least resistance for more on understanding the threats.) 

 

Defensive strategies are not slowing down malware growth.  This does not mean defensive tools and practices are worthless.  I suspect the innovation in security is keeping it in check somewhat, but not slowing it down enough to reduce the overall growth rates.  In fact, without continued investment we would likely be overrun.  We must remain vigilant in malware defense.

 

The rate increase is a reflection of the overall efficacy of security. Malware must be generated at roughly 150% of the previous year's volume, a 50% increase, in order to compensate for security intervention and achieve the desired success. Flooding defenders is only one strategy, as attackers are also demanding higher-quality, feature-rich, smarter, and more timely weapons.

 

Malware must land somewhere in order to operate and do its dirty deeds. PCs, tablets, phones, servers, cloud and VM hosting systems, soon to be joined more prominently by droves of IoT devices, are all potential hosts. Therefore, endpoints will continue to be heavily targeted, and defense will continue to be hotly contested on this crucial battleground. Ignore anyone who claims host-based defenses are going away. Just the opposite, my friends.

 

At a rate of over three hundred thousand new unique samples created per day, I speculate that much of the malware is being generated automatically. It is interesting that, on the defensive side, anti-malware companies are beginning to apply machine learning, community reporting, and peer validation to identify malicious code. It is showing promise. But just wait. Malware writers can use the same kind of machine learning and community reporting to dynamically write code that either subverts detection or takes advantage of time delays in verification. Malware code can quickly reinvent itself before it is verified and prosecuted. This should be an interesting arms race. Can the malware theory sustain? Strangely, I suspect this battle, although potentially significant, may be exactly what the malware model anticipates. The malware metronome ticks on.

 

 

Connect with me:

Twitter: @Matt_Rosenquist

Intel IT Peer Network: My Previous Posts

LinkedIn: http://linkedin.com/in/matthewrosenquist

ll.jpg

I don’t know about you, but while I love being able to browse my favourite store’s latest range from the comfort of my sofa, the hands-on experience that I get from a visit to the store itself is also still very appealing. What’s great about today’s retail landscape is that we have the opportunity to do both. The way we try and buy items from our favourite brands is no longer dictated by the opening hours or stock levels in our local high street store.

 

While this is good news for the consumer, the battle is on for high street retailers. To entice as many shoppers as possible through their doors, retailers need to offer a totally unique shopping experience – something that will convince you and me to put down our tablets and head to the high street.

 

Personalized, anytime shopping on the streets of Antwerp

 

Digitopia, a digital retail solution provider in Belgium, is working with Intel to build devices and apps that retailers can use to create more compelling shopping experiences. By trialing different solutions in various retail environments on Antwerp’s most popular shopping street, Digitopia is helping retailers to define which technologies work best in each store scenario.

 

On Innovation Boulevard, as Digitopia has dubbed it, shoppers can turn their phone into a remote control to browse holidays on a large screen in the travel agent’s window. They can use an interactive fitting room in a fashion boutique to check for alternative colors and sizes of the outfits they are trying on. It’s even possible to order and pay for their cafe refreshments with a smartphone app rather than queuing up in the store. A large number of the solutions are powered by Intel technologies.

 

For shoppers, the retail experience is smoother and more personalized. Importantly, the technologies are also helping retailers to increase sales, offer new services and continue to interact with their customers when the shops are closed.

 

You can read more about the exciting retail experience that Digitopia has created in our new case study. My personal favorite is the possibility to book a holiday while walking between shops – what’s yours?


To continue this conversation, find me on LinkedIn or Twitter.

coworker-collaboration-with-mobile-technology_rz8kse.png

When developing a mobile business intelligence (BI) strategy, you can’t ignore the role that business processes play. In many cases, introducing BI content into the portfolio of mobile BI assets provides opportunities not only to eliminate gaps in your business operations, but also to improve existing processes.

 

Often, the impact is seen in two main ways. First, the current business processes may require you to change your mobile BI approach. Second, the mobile BI solution may highlight gaps that may require a redesign of your business processes to improve your mobile BI assets and your business operations.


Business Processes Will Influence Your Mobile BI Design

 

Existing business processes will have a direct impact on the design of your mobile BI solution. I’m often amazed to discover that the lack of consideration given to identifying business processes stems not from a lack of insight but from wrong assumptions that are made during the requirements and design phases.

 

It’s true that the business processes may not be impacted if the scope of your mobile BI engagement is limited to mobilizing an existing BI asset (like a report or dashboard) without making any changes to the original end-product, including all underlying logic. But in many cases, the opposite is true—the mobile BI end product may be the driver for change, including the update of the existing BI asset as a result of a mobile BI design.

 

Mobile solutions may require different assumptions in many aspects of their design, which range from source data updates to report layout and logic. Advanced capabilities, such as a write-back option, will further complicate things because the integration systems outside the BI platform will require closer scrutiny and a much closer alignment with business processes.

 

Moreover, constraints that surround source data will have a direct influence on the mobile BI design. For example, if you’re dependent on feeds from external data sources, you may need to build in an additional buffer to account for possible delays or errors in the data feed. Or perhaps you have a new application that was just built to collect manually entered data from field operations. If this application was introduced as part of your mobile BI solution, the process that governs its data collection will have a direct impact on your design because the data becomes immediately visible. Without mobile BI, the application would have remained an operational tool with a limited audience, and these process details would have mattered far less.
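The buffer idea can be made concrete with a small sketch, assuming nothing about any particular BI platform: a mobile report can carry a freshness check so that a delayed external feed surfaces as an explicit "stale" flag rather than silently wrong numbers. The arrival time and buffer window below are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical policy: the external feed normally lands by 06:00,
# and we allow a four-hour buffer for delays or reruns,
# so data older than ~10 hours is flagged.
MAX_AGE = timedelta(hours=10)

def freshness_status(last_feed_time: datetime, now: datetime) -> str:
    """Classify a data feed as fresh or stale for display in a mobile report."""
    return "fresh" if (now - last_feed_time) <= MAX_AGE else "stale"

now = datetime(2015, 9, 1, 9, 0)
print(freshness_status(datetime(2015, 9, 1, 5, 30), now))   # feed arrived on time
print(freshness_status(datetime(2015, 8, 31, 5, 30), now))  # feed missed a day
```

Displaying the resulting status (or a "data as of" timestamp) alongside the numbers keeps mobile users from acting on data they don’t realize is a day old.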

 

Mobile BI Solution May Drive Improvements in Your Business Operations

 

As part of designing your strategy or developing your mobile BI solution, you may discover either gaps or areas for improvement. Don’t worry. This is a known side effect, and it’s often considered a welcome gift because it gives you a chance to kill two birds with one stone: improve your business operations and increase the value of your mobile BI solution. However, it’s critical here to ensure that your team stays focused on the end goal of delivering on time and on schedule (unless the gaps turn out to be major showstoppers).

 

Typical examples are found in the areas of data quality and business rules. The design of a mobile BI asset, especially if it’s new, may highlight new or known data-quality issues. The visibility factor may be different with mobile: adoption by executives often forces additional scrutiny. Moreover, adoption rates (the ratio of actual users to total potential users of mobile solutions) may be higher because of the availability and convenience of mobile. As a result, mobile users may be less tolerant of any lack of quality assurance (QA) steps.
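The adoption-rate metric above is just a ratio, but computing it per asset makes the comparison concrete. The numbers below are illustrative, not Intel or SAP data:

```python
def adoption_rate(active_users: int, total_users: int) -> float:
    """Adoption rate = actual (active) users / total potential users."""
    if total_users <= 0:
        raise ValueError("total_users must be positive")
    return active_users / total_users

# Hypothetical example: the mobile version of a dashboard reaches a
# larger share of its audience than the equivalent desktop report.
print(f"desktop report:   {adoption_rate(180, 600):.0%}")  # 30%
print(f"mobile dashboard: {adoption_rate(420, 600):.0%}")  # 70%
```

Tracking this ratio over time, per asset, is one simple way to quantify the "visibility factor" the post describes.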

 

Business rules offer another example due to the same visibility factor. A proposed change in a business rule or process, which previously failed to get attention due to lack of support, may now have more backers when it’s associated with a mobile BI solution. Strong executive sponsorship may influence the outcome.

 

Bottom Line: Do Not Ignore Business Processes

 

It’s easy to make the wrong assumptions when it comes to business processes. It happens not just in mobile BI but in other technology projects. You cannot take existing processes for granted. What may have worked before may not work for mobile BI. Let your business processes complement your overall mobile BI strategy, and let your mobile BI engagement become a conduit for opportunities to improve your operational efficiencies.

 

Not only will these opportunities improve your business operations, but they will lead to increased adoption by increasing the trust your customers/users have in your mobile BI content.

 

What do you see as the biggest challenge when it comes to business processes in your mobile BI strategy?

 

Stay tuned for my next blog in the Mobile BI Strategy series

 

Connect with me on Twitter at @KaanTurnali and LinkedIn.

 

This story originally appeared on the SAP Analytics Blog.

Another Inflection Point


Of all the market transitions hitting the developed world retail industry these days, perhaps the one that will require the greatest industry change – and have the most defining competitive impact – will be the redefinition of product.

 

For a handful of industry leaders, it’s a key component of today’s competitive strategy.

 

For most others – consumed, as they are, by omni-channel integration and digital strategies and mountains of data – it seems to be a bridge too far.

 

At the heart of this issue is an all-too-familiar reality: physical products – at nearly all price points and in nearly all segments – have been commoditized.

 

It’s happened for several reasons. Private label goods offer equal performance at a lower price. Global sourcing enables the immediate copying and delivery (at volume) of hot trends. The internet brings a searing transparency of price and specifications. The quality gaps between good, better, and best have been slimmed, even erased.

 

And whether or not multiple retailers have the same brand and SKU, many have the same category . . . and dozens have the same look.

 

The results of this commoditization are seen in average selling prices. In regular-price sell-through percentages. In the depth of markdowns it takes to clear.

 

Overabundance of consumer choice (1).jpg

A retailer can no longer merchandise his or her way through today’s competitive battles.

 

That is, with increasingly commoditized physical SKUs.

 

But there is an alternative: the rise of services in retail and the services-led redefinition of product.

 

As we look ahead, the operative definition of product will be a curated assortment of goods and services.

 

Using data-driven unique insights into customer behavior, merchants will create value through:

 

  • SKU delivery and subscription services – of everything that’s needed regularly, from milk to diapers to the moss control and bark chips I order every March;
  • SKU usage education – seminars, lessons, even tours on topics ranging from fashion advice to consumer electronics to food;
  • Health and family wellness services – and not only for pharmacies, but for grocery and mass merchandising;
  • So-called “federated” services with other brands – not only your winter-in-Florida outfit, but your flight, resort hotel and starred-restaurant reservations;
  • Home management services – ranging from care to repair.

 

Some services will be a means of locking in user loyalty. Others will create new revenue streams.

 

And it will be through this value-added approach to retailing that brands will survive and ultimately thrive.

 

It’s no surprise that Amazon has already figured this out. Case in point: Amazon Prime. This is a stunning success.

 

In 2013, Prime’s renewal rate was a remarkable 82%.1 In the fourth quarter of 2014, Prime had 40 million US members. A report released in January by Consumer Intelligence Research Partners found that Prime members spend, on average, $1,500 per year on Amazon, compared to $625 for non-members. Prime members also shop 50% more frequently than non-members.2

 

How does Amazon Prime bind shoppers to its brand so effectively? At the heart are the services themselves. The best example I know is automatic delivery of diapers in the right size as a baby grows. Think of it. No more late-night runs to the store.

Shopping app photo- Twin Design- Shutterstock (1).jpg

 

And read that again: no late-night runs to the store.


Brilliant.

 

OK, so what does this mean to the technology community? Why should the digerati care?

 

First of all, this service creation thing is not going to be easy. Shaping the offer is not going to be easy. Monetizing is not going to be easy.

 

It’s going to require deep, unique, tested insight into shopper behavior. Into your brand’s cohorts and personas. Into finding the leading indicators of need and demand.

 

At the foundation of this is Big Data. And moving well beyond Big Data. Into the data analysis worlds inhabited by the leaders.

 

Second of all, delivering the content that enables these services will not be easy. This is going to be about enterprise architecture, data architecture, APIs that open data to the outside world, and APIs that are accessed to bring the outside world inside.

 

And third of all, staffing, training, and delivering services will not be easy. Those who deliver services, and this will be a people business, will be on the go. Not tethered to an aisle or a department or a check stand.

 

The business processes of delivery will no doubt need a highly advanced level of mobile access to information and ease of use.

 

The redefinition of product? Quite honestly, it’s a redefinition of retail.

 

Get ready. It’s coming.

 

 

 

 

1 Forbes, 2014, Kantar Research 2014.


2 Consumer Intelligence Research Partners, January 2015.


*Other Names and brands may be claimed as the property of others.

Dentist-AllInOnePC-Xrays.png

Back in 1995, when I first started going to Wood Family Dentistry for dental care, they tracked patients with paper charts, took film x-rays, and documented exams and treatments manually. But one thing I’ve noticed in the 20 years that I’ve been Dr. Wood’s patient is his intense curiosity and desire to use technology to continually improve the level of care he provides at his Folsom, California-based practice.


Fast-forward to today, and their patient workflows are completely digital, they can instantly view high-definition digital x-rays, and there’s not a paper record in sight. Keith Wood, DDS, and his staff haven’t stopped with those innovations, however. With the help of a portable All-in-One PC, they’ve streamlined and advanced patient care even further.

 

Convenience and comfort in the dental chair

 

In the exam room, the portable All-in-One’s large, mobile touch screen eliminates the need for patients to crane their necks to see images on the wall-mounted monitor. Now, Dr. Wood shows patients highly detailed digital x-rays and other images in the comfort of the exam chair.

 

“With the portable All-in-One, I put it right in their lap and touch, zoom, and really bring things to life,” he explained.

 

Dr. Wood also told me how having a single device that they can use anywhere in the office provides them with a tremendous convenience boost. Not only does it make it easy to access charts and information anywhere in the building, but instead of needing to make room for parents when their kids are in the exam room, the dental team can now bring the portable All-in-One to the waiting room and more conveniently discuss treatment plans.

 

Doctor-Patient-Chart-Mobile.png

Performance that proves itself

 

Dr. Wood was initially skeptical that a portable device could handle the large images and demanding applications that they use, but the performance and responsiveness of their Dell XPS 18 with Intel Core i7 processor has really impressed him and his staff. It gives them rapid access to patient files, the ability to run multiple dental applications at full speed, and the flexibility to input information with touch or keyboard and mouse.

 

“It’s super-easy to use,” Registered Dental Assistant Carry Ann Countryman reported. “You can get from chart, to x-rays, to documents super-fast.”

 

Foundation for the future

 

In addition, their portable All-in-One gives them a solid technology foundation for enabling other new technologies in their practice. They are currently exploring imaging wands that connect to the device to provide fast 3-D dental images for dental molds. And they’re excited about the possibility of adding hands-free gesture controls powered by Intel RealSense Technology in the near future.


Curious how portable All-in-Ones or other Intel-based devices could change how you work? Visit: www.intel.com/businessdesktops
