
IT Peer Network


I’ve been in IT a long time, and I can say unequivocally that in all those years I’ve never seen a more exciting array of new technology with so many opportunities to integrate and benefit. The Internet of Things (IoT)—sensors, intelligent gateways and edge processing, cloud and big data analytics, real-time synergy between disparate systems—the possibilities are almost dizzying. Smart factories. Smart data centers. Smart buildings. Smart parking garages and stadiums. Smart cities.

 

But how do we get from where we are—a tantalizing vision—to reality? How do we merge so many points of view into a cohesive whole that is practical to implement? In other words, how do we bring the Internet of Things to life? In my role as product development chief of staff for Intel IT, I’ve started at the ground floor of the IoT and have had an excellent opportunity to observe and experience both its benefits and its growth pains.

 

Every Thing Needs Interoperability

I propose that interoperability is the key to making the IoT mainstream. For the IoT to really work, “things” need to talk to each other. Facility things need to talk to IT things. IT things need to talk to manufacturing things and vice versa. And if we want interoperability, we need data standards. At Intel, we are working to “break the barriers” between IoT and IT. This work includes projects with our own IT and facilities teams, and with the industry as a whole, to help establish IoT data standards. You can read about some of our early work in our recent white paper, IoT Data Standards Provide the Foundation for Smart Buildings.

 

And it isn’t just things that need to talk to other things. People, too, need to change how they communicate. There can no longer be “the facility team” and “the IT team” and the “corporate services team.” For the IoT to reach its potential, these historically separate knowledge domains need to collaborate.

 

Here at Intel, we are making great strides in these areas. For example, we’re investing in a gateway service network and are closing gaps in wireless connectivity—important preparatory work before our factories and buildings can take full advantage of the IoT. In many ways, this early work is like clearing land for a construction project—we’re planning, removing obstacles, and clearing the path so all of Intel has the necessary tools and infrastructure.

 

Let me tell you about some of the projects at Intel that are building on this work.

  • In our factories, we are conducting proactive vibration analysis—sensors gather vibration data and analysts use that data to make sure silicon wafers aren’t damaged in processing. If the sensors determine that vibrations are outside an acceptable range, an alert is sent to the line manager. This prevents damaged wafers and saves Intel money (a minimal sketch of this kind of threshold check appears after this list).
  • In some conference rooms, we have installed sensors that know when someone enters the room and can automatically boot up the Mini PC running the Intel® Unite™ solution (a wireless collaboration tool), turn up the heat, and turn on the lights. These sensors are also connected to an online tool that employees can use to find an unoccupied conference room—boosting productivity and user experience. In the future, we hope to combine the IoT with our “know me/sense me/free me” initiative, so that users can set their personal preferences for heat and light settings, and when they enter a room, these settings are automatically configured.
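
To make the first bullet concrete, here is a minimal sketch in Python of that kind of out-of-range check and alert. The sensor reading, limits, and notification hook are illustrative stand-ins, not Intel’s actual factory tooling.

```python
# Illustrative threshold check and alert; limits and the alert hook are assumptions.
from dataclasses import dataclass

@dataclass
class VibrationLimits:
    low: float   # lowest acceptable reading for this process step
    high: float  # highest acceptable reading before wafers are at risk

def notify_line_manager(message: str) -> None:
    # Placeholder for an email, pager, or MES alert integration.
    print(f"ALERT: {message}")

def check_reading(tool_id: str, reading: float, limits: VibrationLimits) -> bool:
    """Return True if the reading is in range; otherwise alert the line manager."""
    if limits.low <= reading <= limits.high:
        return True
    notify_line_manager(
        f"Tool {tool_id}: vibration {reading:.2f} outside "
        f"[{limits.low}, {limits.high}] -- inspect before processing more wafers."
    )
    return False

if __name__ == "__main__":
    check_reading("litho-04", 7.8, VibrationLimits(low=0.0, high=5.0))
```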

 

Intel IT is working closely with Intel’s Corporate Services group—which is responsible for setting design standards for smart buildings. They have asked Intel IT to write the IoT dashboards for their projects, and are working with us to implement data standards. In this way, we can emulate Intel’s factories, which use a Copy Exactly methodology—each factory adheres to a strict, well-defined set of standards, which result in cost savings and high reliability.

 

In another example, the Intel Smart Building and Venue Experience Center, built on the Intel® IoT Platform and recently implemented in Chandler, Arizona, is a showcase for the IoT. This center standardized more than a dozen smart stadium capabilities to increase operational efficiency, enhance the fan experience, and provide better security. The center has helped us create a blueprint of the necessary IoT technologies.

 

So far, we’ve brought about 42 groups through the center, including representatives from Arizona State University, Michigan State University, and SAS Institute. When the facility teams saw the IoT in action and how data standards could integrate HVAC, lighting, digital signage—even towel dispensers in the bathrooms—it was like a lightbulb going off over their heads. Their next question is always: So how do we make this happen in our world?

 

Recipe for IoT Success

In my opinion, the following ingredients are pivotal to implementing IoT solutions on any kind of scale. We are implementing all of these at Intel, even as I write this.

  • Go slow to go fast. If you just start implementing changes rapidly without appropriate planning, you will end up with many legacy systems, some in the cloud, some not, some that will talk to each other, and some that won’t. It’s critical to first consider all implications so that you don’t just create a technological mess. Take the necessary time to plan properly and lay out the proper architecture and infrastructure to support current and future states. Starting with a common set of standards and desired end states can yield results much faster than randomly implementing solutions. The goal is to focus on common use cases, achieve the benefits of data reuse, and remove the majority of interoperability issues with systems that need to talk to each other. Plan the future, then build a foundation for that.
  • Build the right platform. You need a reusable, extensible platform to build on. Start it now. Ask yourself, as you choose projects—“Am I thinking for the future or am I just putting in more things that I will have to pull out in three years?” For the IoT to be real, solutions need to be as Plug and Play as the ubiquitous USB devices are today. The foundation you build will keep operational expenses under control and will help ensure privacy, security, and manageability. It will support many use cases and not just the occasional one-off project you’re currently working on.
  • Choose the right projects. Identify the high-priority personas or uses that will make your organization more efficient and that can integrate later with new use cases.
  • Make sure the network is ready. Think of the network as a sensor that can support analytics and manageability.
  • Address the culture challenges. IT departments are used to being autonomous. So are operations teams. But for the IoT to work well, you need to establish a culture of collaboration. We’re using cross-team meetings and other collaboration exercises to bring everyone together to reach a common goal.

 

Here at Intel, we’re starting with data and aligning that with our enterprise strategic goals. I’m envisioning the future state and goal, then working backwards to figure out how to create a single interoperable system that can make it real.

 

What Are You Doing to Make It Real?

The IoT can be intimidating. Data standards are still evolving. There’s a lot to be done. But at Intel, we’re already using the IoT to drive efficiency and support business growth. We believe there’s significant value to be gained by using the power of the IoT.

 

I’d be interested in hearing what other IT professionals are working on in the area of the IoT—what have been your challenges? Share your successes! I encourage you to join the conversation by leaving a comment below.

WPD2016-May5-FB.jpg

No one likes passwords, but the reality is we need some form of identification and authentication to protect our digital reputation and information as well as facilitate customized online experiences.  As we celebrate World Password Day 2016, it is time to make passwords both strong and easy to use.

 

Another year passes by and passwords still remain.  At some point, someone told you passwords were going away.  They lied.  Passwords are here to stay, in one shape or another.  Although unwieldy, they are still the most prevalent means to validate a user.

 

The key to reducing the frustration is to streamline their use while still benefiting from the protection they provide.  But there is an inherent conundrum: if you don’t use them correctly, they don’t provide much protection.  If you do use them properly, they are horribly difficult to manage and they slow down our digital experiences.

 

I may be an anomaly, but my login accounts now number well over one hundred.  Most I only use sporadically, but I do need them.  As a security advocate, I know better than to reuse passwords or simply increment them in a predictable way.  That would be insecure.  To be honest, I don’t have much trust in some of the domains I sign up for.  I suspect some admins might take a peek at users’ credentials, or worse, their security practices are insufficient and my password may eventually be breached by a malicious hacker.  Either way, I expect several of my passwords to be exposed eventually.  Attackers like to try those passwords on other accounts and look for the easy patterns a victim might be using for convenience.  If they figure out the pattern, it is bad news.  Like dominoes falling, your accounts too will tumble into the hands of attackers.  They can log in, steal your data, and impersonate you if they wish.  The damage can be serious enough for anyone to regret employing simple shortcuts to save time.

 

Don’t despair, there is hope.  It is time to take the sting out of password management.

 


Passwords are only protective if you use them correctly, but they don’t have to be hard to live with.  Get organized, let technology do the work for you, and follow these 4 simple rules:

1. Use strong passwords or even better, a passphrase.

Passwords are useless if they can be guessed or easily succumb to brute-force attacks.  So, make them challenging.  Additionally, when in doubt, change them.  Top web services look for suspicious patterns of activity and will notify users of a possible account breach.  Don’t ignore these warnings!  Change your passwords immediately by opening a new browser window and navigating to the site to change your password (never click on links in emails to do this).

2. Make them unique.

Never reuse the same password across different sites.  That makes it simple for attackers to compromise your entire digital life.  Furthermore, don’t make simple increments when changing passwords.  Moving from Password1 to Password2 is just asking for trouble.

3. Use a password manager.

Retire the post-it notes or spreadsheet file.  Using a reputable password manager is a huge time saver and will actually add more security into the mix.  Integrated password managers can automatically log users into websites and applications, which is tremendously convenient.  They facilitate the use of insanely strong and unique passwords, and make dreaded expiration notices a snap to deal with.  No more trying to navigate and interpret obscure hieroglyphs as part of your secret code.  Password managers can generate ridiculously complex passwords that you never need to type in.  They can handle the brunt of the work.  There are secure solutions out there that help take the pain out of the process, like True Key by Intel Security.
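
For the curious, here is a minimal sketch of how a generator can produce strong, unique passwords or passphrases, in the spirit of what a password manager does for you.  It uses Python’s standard secrets module and is illustrative only; it is not how True Key works internally.

```python
# Minimal sketch of strong password/passphrase generation; illustrative, not True Key.
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Build a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def generate_passphrase(wordlist, words: int = 5) -> str:
    """Build a passphrase by picking random words from a supplied word list."""
    return "-".join(secrets.choice(wordlist) for _ in range(words))

if __name__ == "__main__":
    print(generate_password())
    print(generate_passphrase(["correct", "horse", "battery", "staple", "orbit", "lantern"]))
```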

4. Biometrics and multifactor authentication are better!

Biometrics can greatly reduce the frustration of logging in.  Fingerprint readers on phones and facial recognition on PCs are great ways to speed up access.  Systems are also emerging that can detect when you walk away and then lock the device.  Next-generation solutions will take it a step further and unlock it automatically as you return.  Multifactor authentication should be employed in high-value situations, where if your password is compromised, the attacker still needs another form of authentication to proceed.  This thwarts all but the most elite types of attack and is well worth the extra effort for financial accounts and very private communications.

 

Passwords don’t have to be hard to live with.  Get organized and let technology do the work for you.  Passwords aren’t going away anytime soon.  Reduce the loathing and inconvenience while maintaining good security.  Remain vigilant and your passwords can save you from your worst digital day.

 

 

Interested in more?  Follow me on Twitter (@Matt_Rosenquist) and LinkedIn to hear insights and what is going on in cybersecurity.  Also be sure to visit the Intel Security blog for the latest security news.


The New Way To Work

Posted by CHAD CONSTANT Apr 28, 2016


With the rise of today’s increasingly mobile workforce, we’re seeing a shift away from antiquated office environments. Technology advancements are evolving the traditional cubicle into mobile workstations where employees use their device of choice and are no longer chained to their desks. This shift has contributed to a massive wave of innovation in some of the most well-designed business devices yet, devices that eliminate the physical work barriers of the past.

 

One of the latest and greatest examples of this is the recently introduced HP Chromebook 13 powered by the 6th Generation Intel® Core™ m processor. Customers considering Chromebooks get the range of productivity and mobility they need to work more efficiently in today’s office environment. A couple of things to highlight from a mobility perspective: with the 6th Generation Intel Core m processor inside, employees can enjoy long battery life, up to 10 hours, when they’re working on the go. They can also have a great multitasking experience, moving quickly across apps and browsing. HP and Google also achieved a first, as this is the thinnest and lightest Chromebook on the market, measuring 12.9 mm when closed and weighing 2.86 pounds.

 

On the heels of our recent launch, businesses of all sizes are reacting positively to our portfolio of business solutions—from Intel Unite for conference room collaboration to Intel Core and Intel Core vPro for business, these solutions are facilitating a new and better way to work. The HP Chromebook 13 is a great example of the continued innovation we’re driving with our partners.

 

Check out HP’s announcement and Google’s blog to learn more.

 

~Chad

As the variety of IT equipment, infrastructure, and facilities equipment has increased, so has the number of data center management products. Over the years, an ideal goal was to have a management solution that would integrate IT equipment management and facility management—resulting in the centralized monitoring, management, and capacity planning of a data center's critical systems.  This concept is referred to as data center infrastructure management (DCIM).

 

Historically, IT and facilities have worked in separate silos. A DCIM solution can enable those groups to work together more closely to satisfy the needs of the business. Intel developed a data center management solution focused on power and thermal management, which is named Intel® Data Center Manager (Intel® DCM). This solution can be either integrated into a third-party DCIM console or used as a standalone tool.

 

The three physical resources that have the biggest impact on data center costs are power, cooling, and floor space. With Intel DCM, Intel IT has successfully collaborated with Intel’s facility management teams to manage these costs.

 

Here is a typical example of this new collaboration using Intel DCM: One day, an Intel data center operations manager received overheating alerts for several servers. The manager checked the Intel DCM console to locate the hot servers. The building’s facilities team had not seen anything abnormal on their monitors.

 

The data center operations manager found that an air conditioning unit had been accidentally powered off (which is why it did not trigger a facilities alert). The unit was turned back on, and the hotspot was fixed. Without Intel DCM’s rack- and server-level granularity of thermal data, the servers might have shut down, resulting in costly downtime.
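
For illustration, a hotspot check of this kind boils down to grouping servers whose inlet temperatures exceed a threshold by rack. The sketch below is hypothetical; the data layout and threshold are assumptions, and it is not the Intel DCM API.

```python
# Hypothetical rack-level hotspot detection; data format and threshold are assumed.
from collections import defaultdict

INLET_TEMP_LIMIT_C = 32.0  # assumed alert threshold, not a DCM default

def find_hotspots(readings):
    """Group servers whose inlet temperature exceeds the limit by rack."""
    hotspots = defaultdict(list)
    for r in readings:
        if r["inlet_temp_c"] > INLET_TEMP_LIMIT_C:
            hotspots[r["rack"]].append(r["server"])
    return dict(hotspots)

if __name__ == "__main__":
    sample = [
        {"rack": "R12", "server": "srv-0412", "inlet_temp_c": 36.5},
        {"rack": "R12", "server": "srv-0413", "inlet_temp_c": 35.9},
        {"rack": "R07", "server": "srv-0188", "inlet_temp_c": 24.1},
    ]
    for rack, servers in find_hotspots(sample).items():
        print(f"Rack {rack}: check cooling near {', '.join(servers)}")
```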

 

In another instance, Intel DCM data revealed some hotspots in a data center. Facilities added another CRAC unit to increase the volume of cold air in the problematic area. Facilities also boosted airflow there by increasing perforated-tile coverage in the area from 22% to 50%.

 

By using Intel DCM, Intel IT can now monitor IT equipment and coordinate changes with facility managers across the major systems. Intel DCM is increasing operational efficiency and productivity and reducing human error in our data centers.

 

Data center management is an evolving and challenging field. I’d be interested in hearing what other IT professionals are doing to better manage their data centers and if you’re using Intel DCM. Please share your success stories and pain points. I encourage you to join the conversation by leaving a comment below.

As an enterprise organization we are facing a tremendous amount of pressure to move fast, enable the business, and address security threats. I am often asked what is different now than in previous years—we have always had technical change, there have always been security concerns, and we have always been asked to move faster and lower costs. The difference is the increase in pace in each one of these areas and the technical capabilities that are available to address them.

 

Previously, we standardized the client platform and determined the best applications to enable the business. Follow-on work was monitoring the platforms and applications for security and availability, and fine-tuning performance. In essence, we controlled the computing environment of our users and invested in giving them the best solutions possible at the right cost. The days of IT controlling the computing environment are quickly fading. We are now asked to support multiple client computing platforms and even to promote this flexibility with Bring Your Own Device (BYOD) initiatives. With BYOD, we have seen increased consumer expectations for user experience and content availability. Advancements in cloud architectures are making applications and content available as needed, but this also brings greater security risks.

 

To deliver the quick time to market needed for applications and content, we implemented a strategy of using commercial applications through software as a service (SaaS) and developing internal applications through platform as a service (PaaS). We needed a capability that enabled us to move with velocity, connect to internal systems and data, and provide the appropriate level of security and controls to protect our information. We have named our PaaS solution the Intel Cloud Application Platform.

 

The Intel Cloud Application Platform provides application and database services so developers can just focus on business functionality and code deployment. We have seen an enthusiastic organic adoption from our development teams with over 800 applications in development and 73.6 percent cost savings compared to infrastructure as a service (IaaS). With the Intel Cloud Application Platform, applications are developed and launched in days, instead of weeks. The adoption of the Intel Cloud Application Platform is minimizing our security risks with cloud-based solutions. This common development platform has enabled us to implement security best practices for environment configuration and management, code scanning, and identity and access control. The security best practices get integrated into new applications and existing solutions that are upgraded on the Intel Cloud Application Platform.

 

The Intel Cloud Application Platform has proved to be a critical capability for our developers, Intel business, and our information security teams as we adapt to the changing environment. I’d be very interested to hear about how my IT colleagues’ organizations are utilizing PaaS to increase velocity and agility—join the conversation by leaving a comment below!

The financial services sector is not known for making hasty decisions. But the industry is at a critical public perception turning point. Institutions aren't responding fast enough to competition from startups offering new digital services. Intel is already working in the industry to enable scalable standards for technology and security. This blog will highlight three tech segments in particular that will push traditional banking further into the digital age: blockchain currency, the Internet of Things (IoT), and cloud storage.



Blockchain: Regulation, Security, and Privacy

 

Blockchain, the distributed ledger technology behind cryptocurrency, is a decentralized, replicated peer-to-peer network capable of recording transactions. It's poised to transform payment processing by eliminating opaque and inefficient back-office systems. Major financial players like Visa, Nasdaq, Citi, and Capital One invested in blockchain technology in 2015.
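
At its core, the ledger idea is a chain of blocks in which each block’s hash commits to its contents and to the previous block, so tampering with history is detectable. Here is a minimal sketch of that hash-chaining idea; real networks layer consensus, replication, and digital signatures on top, and this is not any particular Intel or industry implementation.

```python
# Minimal sketch of a hash-chained ledger; real blockchains add consensus and signatures.
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    """Create a block whose hash commits to its contents and its predecessor."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

if __name__ == "__main__":
    genesis = make_block([], prev_hash="0" * 64)
    block1 = make_block([{"from": "alice", "to": "bob", "amount": 5}], genesis["hash"])
    # Changing anything in genesis would change its hash and break the link held in block1.
    print(block1["prev_hash"] == genesis["hash"])
```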


And yet, questions of regulation, security, and privacy continue to surround the implementation of blockchain. Intel is working to implement trusted execution environments on its hardware to enhance security and privacy for blockchain users. Further, Intel is looking at developing highly customizable blockchain solutions that can be scaled up to millions of nodes for large-scale use cases. The first real use case for blockchain technology will most likely not be in FinTech, but in scenarios such as in-game credits for online games.

 

The Internet of Things and Money


In the push to adopt cutting edge FinTech solutions, financial services can’t ignore the Internet of Things. Indeed, there have been many failed ideas to come out of this new trend, but there’s also the potential for revolutionary technology. This technology could change how customers use and interact with financial services brands. IoT can connect customers to businesses in new ways. It can also empower them to make better financial decisions throughout their lives.

 

But the fact that so few established enterprises have adopted these new technologies isn’t a huge surprise. The biggest hurdles to overcome? Regulation, compliance, customer engagement in a changing environment, and the constant influx of new competition.

 

As the industry evolves, brands should concentrate on delivering customized and consistent experiences across all channels. Adopting these new methods requires a change in the very operating model of financial services. Brands will need to develop aggressive strategies that proactively participate in and lead digital disruption — rather than waiting and reacting as they’ve done in the past. This will help traditional financial organizations innovate at a pace that keeps them competitive.


In the Cloud


In the digital service economy, a key limiter for financial institutions is the data center infrastructure. Financial services have evolved beyond an era of servers dedicated to a single workload or department. Now, virtualized servers share infrastructure across departments. But the next shift is focusing on agile and efficient service delivery through cloud storage.

 

The consumer space jumped on cloud solutions fast. Financial services have held back, fearing potential security breaches. Replacing legacy architecture and IT silos with the cloud is helping banks get a 360-degree view of customers and their enterprise. These measures are improving operational efficiency, enhancing customer engagement, and supporting compliance and risk management.

 

The financial services sector is finally waking up to the innovations that startups have been creating for years. And old financial enterprises are finally ready to tap into this collective innovation. But it’s not that easy.

 

FinTech is disruptive in both a collaborative and competitive way, leaving traditional financial services with a lot of catching up to do. The good news is that these FinTech startups drive a culture of innovation the industry can leverage to deliver value in new ways. Intel is helping them do just that by delivering solutions for financial organizations to make this shift as easy as possible.


It is important, and difficult, to stay current with relevant issues in our industry.  Cybersecurity is changing furiously, fast in pace and rising in global importance.  Professionals must not only keep abreast of what is happening today, but also what is emerging on the horizon and heading our direction.  Security becomes stronger when professionals collectively explore ideas and actively collaborate on developing better practices.  As a cybersecurity strategist, my eyes are fixed on future risks and opportunities.  Here is my list of what we all must be learning, discussing, and deliberating now, so we can be prepared for what lies ahead.

 

 

Integrity Attacks will Rise to be the Next Wave in Cyber

One constant in cybersecurity is the continual rise in the sophistication and creativity of the threats.  We are seeing the beginnings of a fundamental expansion of attackers’ techniques.  Integrity compromises will rise and join the more familiar Confidentiality (e.g., data breach) and Availability (e.g., denial-of-service) attacks.  Integrity attacks undermine the trust of transactions and communications.  Ransomware, business email scams, and financial transaction fraud are all growing examples of integrity compromises.  This third wave will drive significantly greater impacts due to its nature, the lack of available security tools, and weak processes to manage the risks.  We are already witnessing savvy attackers making hundreds of millions of dollars in a single campaign and will likely see a billion-dollar heist by the end of the year.  Everyone is at risk.

• The Great Bank Robbery: Carbanak cybergang steals $1bn from 100 financial institutions worldwide

• $2.3 Billion Lost to CEO Email Scams: FBI Warns of Dramatic Increase in Business E-Mail Scams

• Bangladesh Bank Hack: How a hacker's typo helped stop a billion dollar bank heist

 

 

IoT Security: Where Digital Life-Safety and Privacy Issues Meet Consumers

Our insatiable desire to integrate technology with our lives is changing the equation of security and safety.  With Internet of Things (IoT) devices growing from 15 billion to 200 billion by 2020, and attackers focusing on gaining access to critical capabilities, we may be unwittingly handing life-safety controls to cyber threats.  Such devices capture our conversations, video, health, activities, location, relationship connections, interests, and lifestyle.  Will personal discretion and privacy survive?

 

IoT security is a huge and complex topic in the industry, earning the attention of everyone from researchers to mainstream media.  Although transportation, healthcare, critical infrastructure, and drones are capturing most of the interest, connected devices and sensors are destined to be interwoven throughout businesses and across all walks of life.  The benefits will be tremendous, as will the accompanying risks.

• Growth of global IoT Security Market To Exhibit 55% CAGR As Threat Of Security Breaches Rises

• Trust and security fears could hold back the Internet of Things

• IoT and Privacy: Keeping Secrets from your Webcam

• Police called after 'drone' hits plane landing at Heathrow

 

 

Why Ransomware will become the next scourge of security

The rise of ransomware is phenomenal, fleecing hundreds of millions of dollars from consumers, businesses, and even government agencies.  This financial windfall for cyber criminals will fuel continued innovation, creativity, and persistence to victimize as many people as possible.  It has found a soft spot, taking advantage of human frailties while targeting something of meaningful value to the victim, then offering remediation at an acceptable price point. This form of extortion is maturing quickly, exhibiting a high level of professional management, coding, and services.  Ransomware is proving very scalable and difficult to undermine.  It will surely continue because it is successful.  Can it be stopped?  How can everyday people and businesses protect themselves? Will security solutions rally? What will we see next in the rapid evolution of ransomware?

• Cyber Threat Alliance report: Lucrative Ransomware Attacks - Analysis of the CryptoWall Version 3

• US Computer Emergency Readiness Team: Ransomware and Recent Variants

• Hospital Declares ‘Internal State of Emergency’ After Ransomware Infection

 

 

What are the Hidden Long Term Impacts of Cybersecurity?

The industry looks at cybersecurity as a series of never-ending tactical issues to be individually addressed.  This is a symptomatic perspective, when the reality is a systemic problem.  The real impacts of the future are hidden from view and are staggering.  It is time we mature our perspectives and see the strategic problems and opportunities.  Estimates range from $3 trillion to $90 trillion in global economic impact by 2030.  We as a community must understand the scale of the challenges and how addressing security in a tactical manner is simply not sustainable.  This is becoming a deep intellectual discussion topic among cyber strategists.  How do we change the mindset from short-term expensive fixes to long-term effective treatment at a holistic level across the ecosystem?

• The Hidden Costs of Cyber Attacks

• Cybercrime may cost $2 trillion by 2019

• $90 trillion cyber impact for one scenario affecting the global benefits of Information and Communications Technologies by 2030

• $3 trillion aggregate economic impact of cybersecurity on technology trends, through 2020

 

 

The Battle for Security Leads to the Hardware

As attackers evolve, they get stronger, smarter, and more resourceful.  It has become a cat-and-mouse game between the threats and the pursuing security capabilities.  The trend is for attackers to move further down the technology stack.  Each successively lower level affords more control and the ability to hide from the security above.  The most advantageous position is in the hardware, where the root of trust originates.  The race is on.  Advanced researchers and attackers are looking to outmaneuver security by compromising the hardware and firmware of devices.

 

Traditional defensive structures must also advance to meet the new challenges.  Security features embedded or enhanced by hardware can be incredibly powerful to support effective defenses and visibility, even against the most advanced attacker.  Control of hardware and firmware will play an ever greater role in protecting technology and users.  Who will win?

• The hardware roots of trust

• Attackers Seek to Hack Hardware for Ultimate Control

• Security on Silicon the Next Big Step in Cyber Protection

 

 

Job Crisis in Cybersecurity

Cybersecurity is in dire straits.  There are not enough talented security professionals to fill the need.  In a few years, there will be an estimated 1.5 to 2 million unfilled cybersecurity positions.  This will have a catastrophic effect on securing people and technology.  Organizations have two problems: finding candidates to fill open positions and retaining the professionals they currently have in the face of lucrative competing offers.  The disparity between supply and growing demand drives up salaries, spurs aggressive headhunting, increases the costs of security operations, limits the overall comprehensiveness of shorthanded teams, and artificially extends the windows of opportunity for attackers.  It’s like trying to play competitive soccer without a full team on the field.

 

The best way to correct the problem is to address the supply side of the equation.  More cybersecurity professionals are needed.  Long term, only academia can save cybersecurity, and it is struggling to retool to sufficiently prepare the next generation of security professionals.  Until then, this problem will affect every organization that needs security staff, potentially for years to come, and may drive up the use of Managed Security Service Providers (MSSPs).

• The Center for Cyber Safety and Education report: 2020 predictions expecting the shortfall of information security positions to reach 1.5 million

• One Million Cybersecurity Job Openings In 2016

• Higher Education Must Save Cybersecurity

• Job Market Intelligence: Cybersecurity Jobs 2015 report published by Burning Glass Technologies

 

 

Cybersecurity Predictions for 2016

A slew of expert predictions is now available from a variety of sources.  They typically come out by the end of the first quarter, and although some are better than others, all of them provide perspectives for 2016 and beyond.  Peering into the future of cybersecurity provides valuable insights into the challenges and opportunities.  The industry is changing rapidly and attackers seem to always be one step ahead.  Take advantage of what the experts are talking about, but beware: some are trying to sell you their wares.  Understand how anticipated trends will affect your organization, customers, and partners in the industry.  Plan how you can adapt to find a sustainable balance in managing the security of computing capabilities and technology.

• Intel Security McAfee Labs 2016 Threat Predictions whitepaper

• Top 10 Cybersecurity Predictions for 2016 and Beyond

• The Top 16 Security Predictions from Companies and Magazines

 

 

How versed are you in these topics?  I believe they will have far-reaching repercussions, and every cybersecurity professional should understand these areas.  Those who benefit from these insights into the future can be better prepared to adapt to the changes.

 

 

 

Interested in more?  Follow me on Twitter (@Matt_Rosenquist) and LinkedIn to hear insights and what is going on in cybersecurity.

Imagine a device that fits in the palm of your hand and lets you stream content, play light games, and check your social media or email on any HDMI television.  That would be pretty convenient, wouldn’t it? Welcome to the new Intel®-based compute stick.


Get More Out of Your TV


 

From living rooms to hotel rooms, big screen HDTVs are practically everywhere. But what if your television could do more? With a new compute stick featuring an Intel® processor, built-in Windows 10 OS, on-board storage, Wi-Fi, and Bluetooth, you can stream content, access rich media, play light games, and even check your favorite social media sites on the biggest screen in the house. Picture richer experiences and limitless entertainment options as an extension of your regular TV, one that also lets you stay connected to the rest of the world.

 

“With the compute stick, there are virtually no limits to what you can get on your TV as far as the type of entertainment you want to access,” said Xavier Lauwaert, category manager at Intel Client Computing Group. “It’s one of the rare devices that enables you to easily connect online and start watching your favorite movies or soap operas, anywhere in the world, from the comfort of your couch or your hotel room.”

 

Given its size, you can also take your personal content with you and safely store your personal files locally; with built-in storage and a Micro SD slot, an always-on Internet connection is not required to enjoy your content.

“I like to say a compute stick gives your TV a college degree and your smart TV a PhD. As long as it has a website, you can access almost anything, anywhere,” Lauwaert adds.

 

Enrich Streaming and Gaming


 

The compute stick lets you access the full universe of streaming entertainment services that you can watch on any web browser. Fans of Netflix or YouTube can live stream movies, music, or video. That’s pretty useful if you’re on the road and need to catch up on the latest zombie apocalypse drama or have free time at home to binge watch an entire season.

“As a Windows device, you can even access premium content through iTunes,” adds Lauwaert. “It’s the only non-iOS device that enables access to that type of content on something that’s smaller than the size of your palm.”

For light gamers, the compute stick connects easily to the Windows Store, which boasts a pretty exhaustive library of Windows 10 titles such as Minecraft: Windows 10 Edition Beta, Age of Empires: Castle Siege and Angry Birds. More and more titles are added to the store every day, so now, consumers can enjoy light gaming in any room of the house and rack up those Xbox Achievements along the way.

 

“Windows 10 includes a dedicated Xbox app, which is the gateway to the Xbox Live ecosystem,” Lauwaert said. “Whether you own an Xbox One or not, the app provides a tool to connect gamers and share content through the compute stick. For example, gamers can also record and clip their most epic gaming moments with built-in Game DVR and then share them with their Xbox Live friends.”

 

Multitask Massively

 

Stream content, check email, browse the web, and stay connected on social media—all at the same time and on the same huge HDTV screen. The Intel®-based compute stick with Windows 10 includes a feature to split your TV screen or monitor so you can multitask to your heart’s content.

 

“One of the cool things about Windows 10 is the updated Snap Assist feature,” adds Lauwaert. “You can snap a window and resize it to two-thirds of your TV screen for the content you’re watching, and then snap a second window to automatically fill in the available space for your social media profiles.”

 

“For example, anyone can watch the national basketball tournament on one side, and on the other side, you’ve got your Twitter feed so you can follow the online conversation about the game.”

 

If you’re ready to get more out of your HDTV, download this handy Flash Card to learn more.

 

Industries around the world rely on new technologies to enhance their capabilities and competitive advantages. This is true for the oil and gas industry, where high-performance computing (HPC) is essential to discover new natural resources using exploration geophysics. Because oil and gas exploration is competitive, risky, and very expensive, the accuracy and efficiency of seismic surveys using technology and HPC simulation are critical to reducing business risk and operating cost. Imagine the business impact of reducing the timeframe for obtaining the results of a seismic simulation from 2 months to just a couple of days. I recently came across such a case with China National Offshore Oil Corporation (CNOOC).



CNOOC is China’s largest offshore oil and gas producer, whose business is focused on searching for large- and medium-sized oil and gas fields offshore. To find new oil and gas resources, CNOOC relies on seismic data from acquisition vessels that record high-resolution echoes from sound waves bouncing off the sea floor.

 

As it gets deeper into its exploration activities, however, the size of CNOOC’s seismic data grows as well, with more projects and higher complexity. A single seismic project, for example, may involve over 100TB of data. CNOOC needed a more efficient, scalable, and higher-performing storage system to manage and transfer large amounts of data as well as ensure efficiency in collecting, recording, processing, and interpreting seismic data.

 

To solve its data storage woes and enhance its oil and gas exploration capabilities, CNOOC evaluated different options before deciding to deploy a solution based on open-source software and industry-standard high-volume servers. It built large storage clusters with Intel® Enterprise Edition for Lustre* software, servers based on the Intel® Xeon® processor E5-2600 v3 product family, and the Intel® Server Adapter X520 family, which provides 10Gb Ethernet connectivity for the storage cluster.

 

Lustre* is an open-source, parallel file system designed for HPC needs. A Lustre* file system can be scaled to multiple storage clusters with thousands of storage nodes, which makes it very suitable for HPC applications and supercomputers. The Lustre*-based solution not only met CNOOC’s storage needs but also reduced the cost of performance and capacity expansion compared to its old system. Because the solution comes with Intel® Manager for Lustre* software, CNOOC also found Lustre* deployment, expansion, and management easier.

 

Thanks to this solution, CNOOC improved storage performance by 4.4 times while gaining a unified storage service that simplified and centralized services for HPC projects and increased utilization of storage facilities and network bandwidth. Computing results that used to take 2 months can now be obtained in just a matter of days.

 

If you want to know in detail how CNOOC enhanced its storage system for improved seismic data processing, you can read the complete case study here.


We all receive loads of unwanted email solicitations, warnings, and advertisements.  It can be overwhelming to the point of being obnoxious.  Some days it feels like an unending barrage of distracting deliveries that requires constant scrubbing of my inbox.

 

Beyond being frustrating, there are risks.  In addition to the desired and legitimate uses of email, there are several shady and downright malicious uses.  Email is a very popular method for unscrupulous marketers, cyber criminals, and online threats to conduct social engineering types of attacks.  Spam, phishing, and fraud are common.  Additionally, many attackers seeking to install malware will use email as a delivery mechanism.  Electronic mail can be an invasive communication mechanism, so care must be taken. 

 

Unfortunately, like most people, I tend to make my own situation even worse.  In my professional role, I devour a tremendous amount of industry data, news, and reports to keep on the pulse of change for technology and security.  This usually requires me to ‘register’ or provide my email address before I get a ‘free’ copy of some analysis I desire.  I could just give a false email, but that would not be ethical in a business environment.  It is a reasonable and expected trade, where both parties benefit.  I get the information I seek and some company gets a shot at trying to sell me something.  Fair enough, so I suffer and give my real work email.  In this tacit game, there is an escape clause.  I can request to no longer be contacted with solicitations after the first email lands in my inbox.  Sounds simple, but it is not always that easy.

 

The reality is I receive email from many more organizations than I ‘register’ with.  Which means someone is distributing my electronic address to many others.  They in turn repeat and now the tsunami surging into my inbox gains strength.  I become a target of less-than-ethical marketers, cyber attackers, and a whole lot of mundane legitimate businesses just trying to reach new customers.

 

Some include an ‘unsubscribe’ link at the bottom, which holds the appealing lure of curbing the flood of email destined for the trash anyway.  But be careful.  Things are not always as they seem.  While attempting to reduce the load in your inbox, you might actually increase the amount of spam, and in the worst case you could infect your system with malware by clicking that link.  Choose wisely!

 

Recommendations for using ‘unsubscribe’:

Rule #1: If it is a legitimate company sending the email, use the ‘unsubscribe’ option.

Make sure the link points back to a domain associated with the purported sender.  Legit companies or their marketing vendor proxy will usually honor the request.
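
As a rough illustration of that check, the sketch below compares the domain in the sender’s address with the host in the ‘unsubscribe’ URL.  It is deliberately naive (it only compares the last two domain labels) and is not a substitute for a real mail filter.

```python
# Naive check that an 'unsubscribe' link points back at the sender's domain; illustrative only.
from urllib.parse import urlparse

def base_domain(host: str) -> str:
    """Naively keep the last two labels, e.g. 'mail.example.com' -> 'example.com'."""
    return ".".join(host.lower().split(".")[-2:])

def unsubscribe_link_matches_sender(sender_address: str, unsubscribe_url: str) -> bool:
    """Return True when the link's host shares a base domain with the sender."""
    sender_host = sender_address.rsplit("@", 1)[-1]
    link_host = urlparse(unsubscribe_url).hostname or ""
    return base_domain(sender_host) == base_domain(link_host)

if __name__ == "__main__":
    print(unsubscribe_link_matches_sender(
        "news@example.com", "https://mail.example.com/unsubscribe?id=42"))  # True
    print(unsubscribe_link_matches_sender(
        "news@example.com", "http://tracker.unrelated-host.test/u?id=42"))  # False
```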

 

Rule #2: If it is a shady company do not ‘unsubscribe’, just delete.

If your mail service supports it, set up a BLOCK or SPAM rule to automatically filter future messages from these senders.

 

If it is seriously malicious, the ‘unsubscribe’ link may take you to a site preconfigured to infect or compromise your system.  This is just another way bad guys get people to click on embedded email links.  DON’T FALL FOR IT!  It may result in a malware infection or system compromise.

 

If it is semi-malicious, like a spam monster that will send mail to any address it can find, then clicking the ‘unsubscribe’ link actually tells them this is a valid email address where someone is reading the mail.  That is valuable for them to know, as they can sell that email address as ‘validated’ to others and use it for future campaigns.  End result: more spam.

 

Rule #3: Some spam and solicitations don’t offer any ‘unsubscribe’ option.

Just delete.  This is probably not a professional company you want to patronize anyway.

 

If you are in a work environment, be sure to know and follow your corporate policies regarding undesired email.  Many companies have security tools that can inspect, validate, or block bad messages.  Additionally, they may have solutions that leverage employees’ reporting of bad email to better tune such protections.

 

Just remember: if you are not sure the email is legit, don’t open or click anything, and NEVER open any attachments, including PDFs, office documents, HTML files, or any executables.  Attachments can be used by attackers to deliver Trojans that may infect your system with malware, ransomware, or other remote manipulation tools, so only open attachments from trusted sources.  Cybercriminals often look like real companies with real products.  Make email life easier by ‘unsubscribing’ with care and necessary forethought.

 

 

Interested in more?  Follow me on Twitter (@Matt_Rosenquist) and LinkedIn to hear insights and what is going on in cybersecurity.

When you design for mobile analytics, the response time (sometimes also referred to as load time) will always come into question. The concept of fast-loading pages is nothing new. Plenty of material exists on the web that covers everything from designing web sites to developing apps.

 

Every so often I am presented with this question: What is an acceptable response time for a mobile analytics asset? And my answer—“it depends”—generally causes confusion. This is often because the person asking the question is looking for a magic number of seconds that’s regularly quoted on the web by different surveys or studies.

 

But providing consistent results can be more challenging than it may appear. There are several factors that will ultimately dictate our success. We can manage some of them through best practices. Others will be completely out of our control.

 

Let’s take a look at several pieces that may come into play.

 

Mobile network presents challenges for response time

 

Mobile analytics inherits many of the same challenges that exist with data-driven apps, which are at the mercy of the mobile networks that they run on. In a previous blog post, I referred to this network as the “digital bridge.” People frequently forget that this digital bridge can be made up of multiple networks and can lead to unreliable bandwidth.

 

In order to manage the challenges related to mobile networks, consider:

  • Displaying warning messages when a slow connection is detected. This is a common method used with many mobile apps that heavily rely on live wireless connections
  • Providing testing tools that your users can use to obtain objective feedback regarding connectivity, especially in remote locations
  • Promoting the offline feature to minimize the use of wireless connections for assets with less frequent data updates

Data vs. page load

 

Typically, the response time in traditional mobile analytics implementations is made up of two main parts. First is the data load, which includes the download of results from the database query. Second is the page load, which includes the necessary elements to display on the page.
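
One practical way to reason about this split is to instrument the two parts separately, so you know whether the query or the page rendering is the bottleneck. The sketch below is generic; the query and render functions are placeholders for whatever your analytics stack actually calls.

```python
# Generic timing sketch; run_database_query and render_page are placeholders.
import time

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

def run_database_query(report_id):
    # Placeholder for the BI server's query against the database.
    time.sleep(0.2)
    return [{"region": "EMEA", "revenue": 1.0}]

def render_page(rows):
    # Placeholder for layout, charting, and asset delivery to the device.
    time.sleep(0.1)
    return f"<page rows={len(rows)}>"

def load_report(report_id):
    rows, data_load_s = timed(run_database_query, report_id)
    _, page_load_s = timed(render_page, rows)
    print(f"{report_id}: data load {data_load_s:.2f}s, page load {page_load_s:.2f}s")

if __name__ == "__main__":
    load_report("sales-dashboard")
```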

 

In order to manage the challenges related to both pieces, consider:

  • Minimizing the query time by tuning the database objects. If the database query takes 60 seconds to return the results on your server, it’s not going to magically take less time on the mobile device.
  • Developing mobile-ready assets (reports and dashboards) with highly focused target deliverables instead of saving desktop reports that are designed for downloads to spreadsheets
  • Designing or leveraging options that can use advanced features to minimize multiple requests for the same report

Adjust based on your audience

 

One of the reasons why I always say that “it depends” is because the concept of response time is dictated more by user expectations or perception and less by the actual time it takes for the page to load. The terms fast and slow are relative depending on your audience. What may be acceptable to one set of users may not be to another.

 

In order to manage the perception, study the target user audiences to match the solution with their expectations:

  • With roles that are customer-facing and demand frequent travel, such as executive teams or sales, the expectation may be in single digits—especially if the solution is required for customer interactions
  • Users with analysts or research roles may be more tolerant due to their past experience and type of work

 

Bottom line

 

Unless you are taking advantage of new technologies, such as in-memory computing, you’ll need to successfully manage response time. Your users’ perceptions, which are largely influenced by their daily (and sometimes less complicated) experiences with their mobile devices, will have a direct impact on the success of your solution.

 

Stay tuned for my next blog in the Mobile Analytics Design series.

 

You may also like the Mobile BI Strategy series on IT Peer Network.

 

Connect with me on Twitter @KaanTurnali, LinkedIn and here on the IT Peer Network.

 

A version of this post was originally published on turnali.com and also appeared on the SAP Analytics Blog.

The security industry is changing.  Technology innovation is eroding the distance between the roles and responsibilities of traditionally independent physical and cyber security teams.  Modern physical security tools now rely heavily on networks, clouds, firmware, and software, which puts them at risk of cyber exploitation.  Computing devices, no matter how well managed, are largely vulnerable to physical attacks.  The biggest convergence between these two worlds is coming from the rapid growth and adoption of the Internet of Things (IoT), which extends access, control, and people-safety issues to users and businesses.  Transportation, critical infrastructure, healthcare, and other industries currently rely on strong physical controls.  More and more, they will also require the same benefits from cybersecurity to achieve the common goals of protecting people’s safety, property, and business assets.  In the highly connected world of the near future, attacks against both physical and cyber targets will originate from far across the digital domain.  Convergence is coming.

 

At this year’s 2016 ISC West conference, one of the largest security conferences with over 1,000 exhibitors and brands, the organizers took an aggressive step that showed their insight into the future.  The Connected Security Expo, a sub-conference with separate tracks, was established and began its inaugural year, bringing together for the first time both physical and cyber security professionals at ISC West.  I was honored to deliver one of the two keynotes to a combined audience of security leaders who recognize the inevitable intersection of security.

 

Organizations must address the combined physical and cyber threats that they will face.  Leaders require insights into the complex future of cybersecurity, both the challenging risks and the equally lucrative opportunities, which will emerge as cyber threats maneuver over time.  In my presentation I discussed how cybersecurity, like its physical counterpart, is a difficult and serious endeavor that strives to find a balance in managing the security of computing capabilities to protect the technology that connects and enriches everyone’s lives.

 

The 2016 Future of Cyber Security presentation showcased the cause-and-effect relationships, provided perspectives on the forthcoming challenges the industry is likely to face, and showed how aligned security can be better prepared to manage them.  A number of other notable speakers, including Mike Howard, CSO for Microsoft, shared insights with the audience.  Herb Kelsey, Chief Architect at Guardtime, and Nate Kube, Founder and CTO of Wurldtech, a GE company, also presented a keynote: “Reducing the Time to Detect Tamper – Physical Security’s Mission Against Cyber Threats”.  They discussed the benefits and risks of the connected world, from power stations to light bulbs and everything in between.  The unintended consequences will include bad actors using technology against us, instead of for us.  The speakers partnered to showcase the future trends and technologies in securing the promise of the Internet of Things.

 

I look forward to next year’s Connected Security Expo, as the audience will continue to grow.  Speaker topics, threats, and the synthesis of technology will be even stronger.  I expect other conferences to start down the same path in an attempt to catch up with ISC West.  It makes sense, as the convergence between physical and cyber security will continue to gain momentum.

 

 

Interested in more?  Follow me on Twitter (@Matt_Rosenquist) and LinkedIn to hear insights and what is going on in cybersecurity.

As a Manufacturing IT Principal Engineer, I have helped Intel’s factory management evolve over the last two decades from manual processes toward the goal of being 100 percent automated. In our recent white paper, “Using Big Data in Manufacturing at Intel’s Smart Factories,” we describe the crucial aspects of Intel’s continuing automated manufacturing journey and the benefits automation—accelerated by the Internet of Things—has brought to Intel. These benefits include reduced cost, accelerated velocity, world-class quality, and improved safety.

 

The automation of manufacturing processes has now achieved global momentum, through strategies such as Germany’s Industry 4.0 concept and China’s “Made in China 2025” initiative. Companies in Europe, China, and elsewhere around the world are endeavoring to automate their manufacturing processes.

 

But as I travel the world discussing Intel’s key learnings about automated manufacturing with customers and partners, I find that many companies are starting from square one. While they are interested in the advanced techniques Intel is using, such as automated material movement, IoT integration, real-time capabilities, virtualization, and interoperability with our suppliers, the manufacturing processes at these companies are still very much based on human intervention, pen-and-paper tracking, and manually operated production lines. Advanced automation is well and good – but how do they get started?

 

To help answer that question, in this blog I’ll cover the basics of factory automation – those critical components that must be in place as a foundation before more advanced practices can be achieved.  I hope this information will help you as you evaluate your own journey.

 

So, what are smart factories made of?

 

Manufacturing Execution System (MES)

 

The first ingredient in a smart factory is the MES. This is the heart of factory automation through which all the factory transactions flow; it is the gatekeeper for equipment and material states. The MES, which can be developed internally but is more often purchased from a third-party supplier, is a transactional system for managing and tracking the state of equipment and work-in-process on a factory floor. As parts move physically on the factory floor, they move logically through the MES. An MES keeps track of information in real time, receiving up-to-the-minute data from various sources, which can include employees, machine monitors, equipment host controllers, and even robots. An MES can enable leaner operations and increase product quality through standardized workflows.

 

When evaluating an MES, a company should consider several factors.

 

For the MES itself, high performance is paramount. The MES architecture should be re-entrant and multithreaded to support a high level of parallelism. (If these terms are unfamiliar, keep reading, and I’ll explain what they mean.) This is because the MES must track and execute transactions for different parts of the factory at the same time. In today’s brutally fast-paced business environment, waiting for information to come through a queue can significantly erode a company’s competitive edge. Other important attributes of an MES include a fully redundant architecture, the availability of remote interfaces, easy access to APIs for customization, and user interfaces that support configuration and manual intervention in processes when necessary.
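As a rough illustration of why that parallelism matters, the sketch below processes transactions for different tools concurrently using per-tool locks instead of a single queue. It is a toy model under my own assumptions, not an MES architecture; the tool names and timings are invented.

```python
from concurrent.futures import ThreadPoolExecutor
import threading
import time

# One lock per tool, so transactions for different tools never wait on each other.
# A single global lock (the "one queue" problem) would serialize the whole factory.
TOOL_LOCKS = {tool: threading.Lock() for tool in ("ETCH-01", "LITHO-02", "CMP-03")}

def execute_transaction(tool_id, lot_id):
    """A re-entrant handler: no shared mutable state beyond the per-tool lock."""
    with TOOL_LOCKS[tool_id]:
        time.sleep(0.1)                        # stand-in for the real transaction work
        return f"{lot_id} tracked on {tool_id}"

with ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(execute_transaction, tool, f"LOT-{i:03d}")
               for i, tool in enumerate(TOOL_LOCKS)]
    for future in futures:
        print(future.result())
```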

 

The MES database is usually determined by the MES vendor, so it is an important part of the MES evaluation process. Again, don’t skimp on performance – automation can put a heavy load on the database, and the database must be able to scale as automation increases. Similar to the MES, look for a database that is highly parallel. For example, some databases lock the entire table when an update is occurring, while others are more flexible and lock only the row that is being updated.
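The difference between table-level and row-level locking can be simulated in a few lines. The sketch below is a deliberate simplification (a real database involves far more than a lock per row), but it shows why finer-grained locking lets concurrent updates overlap instead of queueing behind one another.

```python
import threading
import time

ROWS = {i: 0 for i in range(8)}

def run_updates(locks):
    """Apply one slow update per row concurrently, using the supplied lock scheme."""
    def update(row_id):
        with locks[row_id]:
            time.sleep(0.05)                   # stand-in for the actual write
            ROWS[row_id] += 1
    threads = [threading.Thread(target=update, args=(i,)) for i in ROWS]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start

table_lock = threading.Lock()
print("table-level lock: %.2fs" % run_updates({i: table_lock for i in ROWS}))       # updates serialize
print("row-level locks:  %.2fs" % run_updates({i: threading.Lock() for i in ROWS})) # updates overlap
```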

 

Another important MES and database consideration is support availability. It’s best to find out what others in the industry are using, and choose an MES and database that are well supported by the vendor(s) for the operating system in use. Going with an obscure choice, or a combination of MES, database, and operating system that is not popular in the industry, could leave you on an island, with the dubious privilege of uncovering many of the system’s bugs and issues without a fast path to fixing them.

 

Middleware

 

Think of middleware as the mail carrier for factory automation, one who also speaks every language on the planet. Middleware serves as an isolation and translation layer across the entire installed base of automation systems. It enables you to reconfigure or swap out a factory automation system without having to make changes to any of the other automation components. It also supports transaction aggregation, which is useful in many equipment transaction situations, such as moving a carrier of production material into or out of a process operation.

 

When a factory transaction executes, the middleware abstracts this transaction and translates the single “action” into many separate actions. For example, it could start actions in the MES and scheduling system, set a flag in the quality control system, and publish data to a repository.
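A minimal publish/subscribe sketch illustrates that fan-out: one logical event is delivered to several downstream systems. The subscriber names and event fields are invented for illustration and do not correspond to any specific middleware product.

```python
# Hypothetical subscriber callbacks: each downstream system registers interest in an
# event type, and the middleware fans a single factory "action" out to all of them.
subscribers = {
    "carrier_moved": [
        lambda evt: print("MES: track", evt["carrier"], "into", evt["operation"]),
        lambda evt: print("Scheduler: update plan for", evt["carrier"]),
        lambda evt: print("Quality: set inspection flag for", evt["carrier"]),
        lambda evt: print("Repository: archive", evt),
    ]
}

def publish(event_type, event):
    """Translate one factory action into separate actions for each subscribed system."""
    for handler in subscribers.get(event_type, []):
        handler(event)

publish("carrier_moved", {"carrier": "FOUP-77", "operation": "OXIDE_ETCH"})
```

Because the downstream systems only see events, any one of them can be swapped out without touching the others, which is exactly the isolation benefit described above.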

 

Middleware usually handles every transaction and routes every message in the factory. Therefore, evaluate the available middleware solutions for high performance and parallelism, following a process similar to the MES/database evaluation.

 

PCs at Each Equipment Station

 

PCs are the ears, eyes, and mouth of factory automation. They communicate with the equipment, control it, and send the resulting information back to factory automation systems such as the MES. At Intel, we use PCs equipped with high-end processors because performance is critical to keeping Intel’s factories running at peak capacity. High-performance PCs are necessary because a PC may communicate with a production tool many times per second, with hundreds of data variables in play – and data comes back at that same rate – providing real-time data that powers quality decisions. (Some factories that focus on product assembly may not need this level of monitoring, but the semiconductor industry certainly does.)

 

In addition, the PC must communicate simultaneously with the MES and the equipment. Therefore, it must be capable of multiplexing (the simultaneous transmission of several messages along a single channel of communication). The PC also serves as the operator interface for controlling equipment (start, stop, change parameters).
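Here is a minimal sketch of that idea: one thread polls simulated equipment several times per second while another forwards the readings to the MES, so neither side blocks the other. The variable names, rates, and values are assumptions made up for the example, not real tool parameters.

```python
import queue
import random
import threading
import time

data_q = queue.Queue()
stop = threading.Event()

def poll_equipment():
    """Simulate the station PC reading tool variables several times per second."""
    while not stop.is_set():
        reading = {"chamber_temp_c": 250 + random.uniform(-1, 1),
                   "rf_power_w": 1500 + random.uniform(-5, 5)}
        data_q.put(reading)
        time.sleep(0.1)                        # ~10 reads per second in this toy example

def report_to_mes():
    """Forward the same stream to the MES without blocking the poller."""
    while not stop.is_set() or not data_q.empty():
        try:
            reading = data_q.get(timeout=0.2)
        except queue.Empty:
            continue
        print("MES <-", reading)

threads = [threading.Thread(target=poll_equipment), threading.Thread(target=report_to_mes)]
for t in threads:
    t.start()
time.sleep(1)
stop.set()
for t in threads:
    t.join()
```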

 

Connections between the PC and the equipment are defined by the equipment in use. Many semiconductor factories use Ethernet connections with SEMI standard protocols, but connections could also use Wi-Fi. Other factories might use serial interfaces, Modbus, CAN bus, EtherCAT, or one of many other options.

 

Additional Factory Automation Components

 

Several backend components make up the rest of the automation system. These include, but are not limited to, the following:

  • Statistical process control system
  • Yield analysis system
  • Scheduling system

When a company is ready for more advanced automation, it may add automated material handling through robotic delivery (methods include overhead hoist, ground-based transport, and rail-mounted transport). A material control system is required to maintain the state of the automated vehicles and to provide instructions and synchronization.
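As a small illustration of one of those backend components, here is a minimal statistical process control (SPC) sketch in Python. The three-sigma limits and the vibration-style readings are illustrative assumptions, not Intel’s actual SPC rules.

```python
import statistics

def control_limits(baseline, sigma=3):
    """Compute classic Shewhart control limits from an in-control baseline sample."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return mean - sigma * sd, mean + sigma * sd

def out_of_control(readings, lcl, ucl):
    """Return the readings that should trigger an alert to the line manager."""
    return [(i, x) for i, x in enumerate(readings) if not lcl <= x <= ucl]

baseline = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.51, 0.49]    # e.g. vibration amplitude
lcl, ucl = control_limits(baseline)
print(out_of_control([0.50, 0.53, 0.72, 0.49], lcl, ucl))      # flags the 0.72 excursion
```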

 

Putting it All Together

 

If it seems like there’s a lot to factory automation, you’re right. That’s why creating a reference architecture for computer integrated manufacturing is important before beginning implementation. Because Intel has decades of experience in factory automation, we’re documenting our journey with the goal of helping fellow travelers create their own factory automation architecture.

 

In the next few months, I’ll be posting additional blogs talking about more advanced aspects of smart factories, the industrial Internet of Things, edge computing, and how Intel is putting technology to work in our manufacturing facilities. In the meantime, I’d love to hear your challenges and success stories – please leave a comment below and join the conversation.

In mid-March, tech innovators from all over the world gathered for the 5G Summit in Taipei to discuss the future of mobile network technology. The conference was significant for the tech community, because 5G is poised to change not only mobile technology but also how we conduct business and connect with each other. It’s nearly impossible to overstate the impact 5G will have. That doesn’t mean the switch will be easy, though. Quite the contrary. Here are my three main takeaways from the summit, and what they mean for the future of your tech.



1. Taiwan’s role in the future of 5G

 

Deciding to hold the conference in the capital of Taiwan was no coincidence. Taiwan is renowned in the tech sector for leading design and manufacturing. In fact, just a month before the 5G Summit, Ericsson announced a strategic partnership with Quanta Computer, a Taiwanese leader in cloud computing. Together, they’ll be scaling design and developing data center solutions.

 

The companies present at the conference were some of the most cutting-edge creators in technology today. They listened as the world’s leading telecom service providers, including Vodafone, Verizon, Bell Canada, China Mobile, Orange, and Telecom Italia, discussed their requirements for specific 5G services and use cases. The presentations emphasized that the transition to 5G will bring communication and computing together in a way we’ve never seen before, and that Taiwanese companies are positioned to play a huge part in the rollout and success of 5G.

 

2. A game changer

 

The technology involved in 5G will require small cells that connect to billions of embedded devices, and many Taiwanese companies attended the event looking to get a head start in development of 5G hardware and software.

 

Earlier this year at Mobile World Congress, Intel announced plans to collaborate with several industry leaders in an effort to accelerate the path to 5G. In fact, the connection between these two events is strong. How 5G services and requirements will differ depending on the vertical market and use cases was a discussion that started at Mobile World Congress and continued at the 5G Summit.

 

Next Generation Mobile Networks (NGMN) made a point of emphasizing in its presentation that it will take a lot of work to identify unique 5G use cases and related KPIs worldwide. The low-latency requirements in automotive, for example, will likely be significantly more stringent than those in a typical consumer use case.

 

3. Transforming the network

 

Throughout the conference, one message was repeated over and over: 5G networks have to transform to allow easier deployment through software-defined networks (SDN) and network function virtualization (NFV) that can run on standard servers. This was especially interesting to me since Intel is actively engaging with service providers and the SDN/NFV ecosystem. We’ll be opening another NFV customer engagement center in Taipei in the second quarter of this year.

 

There was also a consensus among speakers and attendees that 5G and LTE will coexist. One presentation called out specific chip package sizes and power consumption targets, which gave the Taiwanese hardware companies something to consider. For these technologies to coexist, services and the corresponding devices must be designed to aggregate bandwidth while maintaining reasonable power consumption. The most interesting message, repeated by several participants, was a caution against adopting pre-standard solutions.

 

It was exciting to participate in the 5G Summit this year. I can’t wait to see how this technology transforms business and enables bigger leaps in technological innovation. Did you attend the conference? Please, share your takeaways.

  • Who do you think is going to be first to roll out standards-based 5G service?
  • Which verticals and use cases do you think will drive the fastest commercial adoption and where?
  • What leading companies in these verticals will benefit the most from 5G?

 

Given the healthy competition to be first among operators and countries, it will be interesting to see how this all plays out.

 

Tim Lauer is Intel’s Director of Sales for Cloud and Communication Service Providers in the APJ (Asia Pacific and Japan) region. Connect with him on Twitter and LinkedIn.

In mobile analytics design, the “case for small” stems from the need to effectively manage performance and response time for mobile experiences. The concept has nothing to do with smaller screens or device sizes. Instead, it deals with the delivery of the content onto those screens.

 

One of the common denominators of all mobile user experiences is what I call the “patience factor.” Mobile users tend to be less patient about performance and response time than PC users, since they’re on the go with less time to spare.

 

On the other hand, the unmatched access and convenience of mobile makes people heavy users of the technology, and their expectations are largely shaped by their daily experiences with their mobile devices, which are all about ease of use and instant results.

 

The challenge for mobile design is that many traditional data and analytics platforms can’t handle large volumes of data on wired systems, let alone on wireless networks. Unless you’re taking advantage of the latest technology such as in-memory computing, the case for small remains undeniable.

 

Let’s take a look at two key areas where the case for small makes sense with traditional mobile analytics platforms.

 

Mobile query size


If you’re going to load data from a traditional database, the size of the underlying mobile query will undoubtedly impact the performance.

  • Define your mobile query with the minimum number of data elements needed to satisfy the business requirement. It’s completely acceptable to experiment with large queries during development, but you must clean them up before going live.
  • Optimize your queries at the database level and take advantage of any additional features your analytics or business intelligence application offers.
  • If it makes sense and reduces the load, use several smaller data queries instead of one large query.
  • As an alternative to loading all of the data at once, consider loading only the data required for the initial analysis.
  • For certain requirements, think about cached data (storing values that have been computed in advance); a minimal sketch follows this list. Although this duplicates original values that are stored elsewhere, the performance gains may be invaluable—especially for certain audiences, such as senior executives.
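For illustration, here is a minimal Python sketch of that caching idea using in-process memoization. The function name, query delay, and return values are invented for the example; in practice you would more likely rely on your BI platform’s caching layer or a precomputed table.

```python
from functools import lru_cache
import time

@lru_cache(maxsize=128)
def regional_sales_summary(region, period):
    """Stand-in for an expensive aggregate query; the decorator caches the result."""
    time.sleep(2)                              # simulate the slow backend query
    return {"region": region, "period": period, "total": 1_234_567}

regional_sales_summary("EMEA", "2016-Q1")      # slow: hits the "database"
regional_sales_summary("EMEA", "2016-Q1")      # fast: served from the cache
```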

 

Mobile analytics asset


If you put aside the constraint of the most valuable mobile design property—real estate—avoiding a bottleneck depends on how the mobile analytics asset is designed. Consider the following:

  • Review the number of underlying calculations that are created inside the report. Are they all really necessary or simply duplicates or leftovers from draft copies that can be eliminated?
  • Can you leverage standard definitions/calculations? If one doesn’t exist, does it make sense to create one as part of your data model instead of inside the mobile asset?
  • How do you plan to deliver the assets: Push or Pull? (Read more on this topic.)
  • How are you configuring the mobile asset in terms of the data refresh? Is it automatic (loads the latest data on open) or manual?
  • Do you have offline capability, which eliminates the need for a 24/7 wireless connection and the need to refresh data sets that don’t change frequently?

 

Bottom line


Mobile analytics is about delivering actionable insight—not loading thousands of rows. Such overload won’t necessarily promote faster, better-informed decision making, and it’s sure to cause unnecessary headaches for everyone involved.

 

Stay tuned for my next blog in the Mobile Analytics Design series.

 

You may also like the Mobile BI Strategy series on IT Peer Network.

Connect with me on Twitter @KaanTurnali, LinkedIn and here on the IT Peer Network.

A version of this post was originally published on turnali.com and also appeared on the SAP Analytics Blog.
