Veeam 2020 Technology Predictions

By Dave Russell, Vice President of Enterprise Strategy at Veeam

Throughout 2019, technology has continued to have a transformative impact on businesses and communities. From the first deployments of 5G to businesses getting to grips with how they use artificial intelligence (AI), it’s been another year of rapid progress.

From an IT perspective, we have seen two major trends that will continue in 2020. The first is that on-premises and public cloud will increasingly become equal citizens. Cloud is becoming the new normal model of deployment, with 85% of businesses self-identifying as predominantly hybrid-cloud or multi-cloud today. Related to this are the issues of cybersecurity and data privacy, which remain the top cloud concerns of IT decision makers. In 2020, cyber threats will increase rather than diminish, so businesses must ensure that 100% of their business-critical data can be recovered.

Here are some of the key technology trends that businesses will look to take advantage of and prepare for in the year ahead.

  1. Container adoption will become more mainstream.

In 2020, container adoption will lead to faster software production through more robust DevOps capabilities, and Kubernetes will consolidate its status as the de facto container orchestration platform. The popularity of container adoption, or ‘containerization’, is driven by two things: speed and ease. Containers are lightweight, isolated runtime environments that decouple an application from the underlying operating system. With containers, microservices are packaged with their dependencies and configurations, which makes it faster and easier to develop, ship and deploy services. The trend towards multi-cloud means businesses need data to be portable across various clouds, especially the major providers: AWS, Microsoft Azure and Google Cloud. 451 Research projects the market size of application container technologies to reach $4.3 billion by 2022, and in 2020 more businesses will view containers as a fundamental part of their IT strategy.
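
To make the packaging idea concrete, here is a minimal sketch using the Docker SDK for Python (the docker package); the image name, port mapping and container name are illustrative assumptions, not any particular product's setup.

```python
# Minimal sketch: running a packaged microservice as a container
# via the Docker SDK for Python. The image name and port mapping
# are illustrative assumptions.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Pull and run a service image; its dependencies and configuration
# travel inside the image, which is what makes the unit portable.
container = client.containers.run(
    "nginx:alpine",          # example image
    detach=True,             # run in the background
    ports={"80/tcp": 8080},  # map container port 80 to host port 8080
    name="demo-service",
)
print(container.status)
```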

  2. Cloud Data Management will increase data mobility and portability.

Businesses will look to Cloud Data Management (CDM) to guarantee the availability of data across all storage environments in 2020. Data needs to be fluid in the hybrid and multi-cloud landscape, and Cloud Data Management’s capacity to increase data mobility and portability is the reason it has become an industry in and of itself. The 2019 Veeam Cloud Data Management report revealed that organizations pledged to spend an average of $41 million on deploying Cloud Data Management technologies this year. To meet changing customer expectations, businesses are constantly looking for new methods of making data more portable within their organization. The vision of ‘your data, when you need it, where you need it’ can only be achieved through a robust CDM strategy, so its importance will only grow over the course of next year.

  3. Backup success and speed give way to restore success and speed.

Data availability Service Level Agreements (SLAs) and expectations will rise in the next 12 months, whereas the threshold for downtime, or any discontinuity of service, will continue to decrease. Consequently, the emphasis of the backup and recovery process has shifted towards the recovery stage. Backup used to be challenging, labor-intensive and costly. Faster networks and backup target devices, as well as improved data capture and automation capabilities, have accelerated backup. According to our 2019 Cloud Data Management report, almost one-third (29%) of businesses now continuously back up and replicate high-priority applications. The main concern for businesses now is that 100% of their data is recoverable and that a full recovery is possible within minutes. As well as providing peace of mind when it comes to maintaining data availability, a full complement of backed-up data can be used for research, development and testing purposes. This leveraged data helps the business make the most informed decisions on digital transformation and business acceleration strategies.
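
As an illustration of restore-first thinking, here is a minimal sketch, assuming backups are restored to files on disk, that verifies a restored copy byte-for-byte against the source; the paths and helper names are hypothetical, and real backup products ship their own verification tooling.

```python
# Minimal sketch: verifying that restored data matches the original
# byte-for-byte using SHA-256 checksums. Paths are hypothetical.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large backups fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_restore(source_dir: Path, restored_dir: Path) -> bool:
    """Return True only if every source file was restored intact."""
    for src in source_dir.rglob("*"):
        if src.is_file():
            restored = restored_dir / src.relative_to(source_dir)
            if not restored.is_file() or sha256_of(src) != sha256_of(restored):
                return False
    return True

# Example usage: verify_restore(Path("/data/live"), Path("/data/restored"))
```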

  4. Everything is becoming software-defined.

Businesses will continue to pick and choose the storage technologies and hardware that work best for their organization, but data center management will become even more about software. Manual provisioning of IT infrastructure is fast becoming a thing of the past. Infrastructure as Code (IaC) will continue its proliferation into mainstream consciousness. By allowing businesses to create a blueprint of what infrastructure should do and then deploy it across all storage environments and locations, IaC reduces the time and cost of provisioning infrastructure across multiple sites. Software-defined approaches such as IaC and Cloud-Native — a strategy which natively utilizes services and infrastructure from cloud computing providers — are not all about cost though. Automating replication procedures and leveraging the public cloud offers precision, agility and scalability, enabling organizations to deploy applications with speed and ease. With over three-quarters (77%) of organizations using software-as-a-service (SaaS), a software-defined approach to data management is now relevant to the vast majority of businesses.
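
To show what a deployable blueprint can look like, here is a minimal Infrastructure as Code sketch using Pulumi's Python SDK (one IaC option among several; Terraform and CloudFormation are others); the resource name and tags are illustrative assumptions.

```python
# Minimal Infrastructure as Code sketch using Pulumi's Python SDK.
# The bucket name and tags are illustrative assumptions.
import pulumi
from pulumi_aws import s3

# Declare the desired state; the IaC engine makes reality match it,
# so the same blueprint can be replayed across sites and accounts.
backup_bucket = s3.Bucket(
    "backup-bucket",
    versioning=s3.BucketVersioningArgs(enabled=True),  # keep old copies
    tags={"purpose": "offsite-backup"},
)

pulumi.export("bucket_name", backup_bucket.id)
```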

  5. Organizations will replace, not refresh, when it comes to backup solutions.

In 2020, the trend towards replacement of backup technologies over augmentation will gather pace. Businesses will prioritize simplicity, flexibility and reliability in their business continuity solutions as the need to accelerate technology deployments becomes even more critical. In 2019, organizations said they had experienced an average of five unplanned outages in the last 12 months. Concerns over the ability of legacy vendors to guarantee data availability are driving businesses towards total replacement of backup and recovery solutions rather than augmentation with additional backup solutions used in conjunction with the legacy tool(s). The drivers away from patching and updating solutions to replacing them completely include maintenance costs, lack of virtualization and cloud capabilities, and shortcomings related to speed of data access and ease of management. Starting afresh gives businesses peace of mind that they have the right solution to meet user demands at all times.

  6. All applications will become mission-critical.

The number of applications that businesses classify as mission-critical will rise during 2020, paving the way to a landscape in which every app is considered high-priority. Previously, organizations have been prepared to distinguish between mission-critical apps and non-mission-critical apps. As businesses become completely reliant on their digital infrastructure, the ability to make this distinction becomes very difficult. The 2019 Veeam Cloud Data Management report revealed that, on average, IT decision makers say their business can tolerate a maximum of two hours’ downtime of mission-critical apps. But what apps can any enterprise realistically afford to have unavailable for this amount of time? Application downtime costs organizations a total of $20.1 million globally in lost revenue and productivity each year, with lost data from mission-critical apps costing an average of $102,450 per hour. The truth is that every app is critical.

Driving the Omnichannel Strategy with AI

By Vikram Bhat, Chief Product Officer, Capillary Technologies

Ecommerce momentum is becoming unstoppable as brands cash in on a number of factors, such as the spike in mobile applications, a new generation of high-spending shoppers, and the availability of faster internet speeds, enabling them to offer their customers a shopping experience whenever and wherever they choose. In addition, an omnichannel approach has clearly shaped the retail industry in 2019, a sector largely driving ecommerce sales in the region.

Going omnichannel is tempting for many retailers who have not yet embarked on their digital transformation journey. However, implementing an omnichannel strategy isn’t only about being present on all channels and platforms available. It is about providing a seamless and unified brand experience to customers across channels to enable them to connect with a brand and simplify their shopping experience.

A Google report further supports this approach: the study found that 85 percent of shoppers start their shopping journey on one device, such as a laptop, and end it on another, say a smartphone, or even in a physical store.

While technology is the key enabler for brands wanting to enhance their omnichannel strategy, Artificial Intelligence is a crucial component driving its success. But AI is only a tool, not a standalone solution, so organizations need to understand that while it can be immensely beneficial in providing customer insights, it cannot compensate for a modest or nonexistent omnichannel strategy. In short, AI needs to be a supporting element of a wider omnichannel strategy, not something implemented for the sake of being a hot technology.

When organizations take this approach, the power of AI can truly be unleashed to boost sales and customer engagement. Let’s take a look at how AI can be applied online as well as in brick-and-mortar stores.

  • Unlocking data potential: Imagine the amount of data brands have access to via multiple platforms. AI can help brands process this data to identify consumer spending patterns, buying preferences, customer demographics, personal preferences, and so on.
  • Personalization: The best way AI can help brands is with the power of personalization. It allows brands to communicate with their target audience at the right time, with the right product, the right offer and message, through the right channel. Brands are able to achieve higher response rates, increased customer loyalty, and lower marketing costs (a minimal scoring sketch follows this list).
  • Image Search: AI allows consumers to search for products based on images they’ve come across. Shoppers simply take a picture and get matched to similar items on ecommerce websites. A good example is Pinterest, which leverages this technology by allowing users to select any item in any photograph online and then surfacing similar items through image recognition software.
  • Enhancing customer service: Chatbots are a popular and invaluable way for brands to offer 24/7 customer service support on their ecommerce websites. They simulate human-like conversations with customers and can execute tasks, automate order processing, and provide accurate answers to customers about product details, quantities and shipping terms.
  • Generating customer insights in-store: AI deployed in physical stores helps capture and correlate in-store customer behavior data and shopping preferences with digital channels like social, email, and mobile apps. These insights can be passed on to sales associates for cross-selling, up-selling and strengthening customer engagement directly on the sales floor.
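
To make the personalization item concrete, here is a minimal scoring sketch that ranks candidate offers by the similarity between a shopper's interaction history and each offer's profile; the categories, scores and offer names are toy assumptions, not real customer data.

```python
# Minimal personalization sketch: rank offers by cosine similarity
# between a shopper's interaction history and product profiles.
# Categories and scores are toy assumptions.
import numpy as np

categories = ["shoes", "dresses", "electronics", "beauty"]

# How strongly one shopper has engaged with each category
shopper = np.array([5.0, 2.0, 0.0, 1.0])

# Candidate offers profiled over the same categories
offers = {
    "sneaker-sale": np.array([1.0, 0.0, 0.0, 0.0]),
    "summer-dress": np.array([0.0, 1.0, 0.0, 0.2]),
    "headphones":   np.array([0.0, 0.0, 1.0, 0.0]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Send the offer whose profile best matches the shopper's history
ranked = sorted(offers, key=lambda o: cosine(shopper, offers[o]), reverse=True)
print(ranked)  # ['sneaker-sale', 'summer-dress', 'headphones']
```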

The use of AI becomes even more powerful when combined across all channels. Organizations that realize its potential will not only drive sales and improve efficiency across platforms, but will also build a strong and loyal clientele in the long-run.

Simplifying 8 Key Authentication Terms

By Axel Hauer, Director of EMEA Enterprise Sales, IAMS at HID Global

Industry jargon around authentication is practically inescapable. In today’s threat landscape, there’s certainly justification for keeping these topics front-of-mind and sparking conversation. But when authentication concepts start to get a little tangled, it can be hard to know if you’re speaking the same language as everyone else. Simplifying key terms clarifies what they really mean, and it also shows just how complex the world of authentication can be.

1. Strong Authentication

Strong authentication is one of those industry terms that’s been overused in so many contexts that its significance has been blurred.

Many people consider strong authentication to be the same as multi-factor authentication (MFA) or two-factor authentication (2FA), but if you examine the European Central Bank’s standards for strong customer authentication, there are a few more hoops to jump through than just having more than one factor (a minimal two-factor check is sketched after the list):

  • There have to be at least two methods used to authenticate. These two methods should come from these three categories: something only the user knows, something only the user has, or something only the user is.
  • The methods used have to be independent of one another, meaning if one is breached, the others aren’t automatically compromised. One also has to be non-replicable (unable to be duplicated), unable to be stolen through online means, and not reusable.
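
To make the two-factor requirement concrete, here is a minimal sketch using the pyotp library, pairing a password (something the user knows) with a time-based one-time code from an authenticator app (something the user has); the password handling is deliberately simplified and would use a dedicated password hasher such as bcrypt or Argon2 in practice.

```python
# Minimal sketch of two independent factors: a password check plus
# a TOTP code verification via the pyotp library. Secret storage
# and password hashing are deliberately simplified here.
import hashlib
import pyotp

# Provisioning (done once): the server stores a per-user TOTP secret
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print("Enroll this secret in an authenticator app:", secret)

def authenticate(password: str, otp_code: str, stored_hash: str) -> bool:
    """Both factors must pass; breaching one does not break the other."""
    knows = hashlib.sha256(password.encode()).hexdigest() == stored_hash
    has = totp.verify(otp_code)  # checks the current 30-second window
    return knows and has
```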

Here’s a caveat, though: this term, like any term based (however loosely) on codified standards, can be a double-edged sword. Just because you’ve complied with standards doesn’t mean you’ve chosen the most secure or appropriate mix of authentication factors for your organization. Compliance matters, but strategy and thoughtful implementation matter too.

2. Authorization Creep

To understand the problem posed by authorization creep you first need to understand the difference between authentication and authorization. Authentication is when a system determines that you are who you say you are. Authorization is when the system determines what you have the right to do within the given network or application, given your authenticated identity. That’s where things can get tricky.

The problem with authorization creep, also called privilege creep, is that the threat it poses to your organization will typically have nothing to do with the strength of your authentication, but instead is all about your policies, oversight, and the ease of managing your system. The fanciest, most high-tech authentication protocols won’t mean a thing if legitimate users are over-authorized. Pretty creepy, right?
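
Here is a minimal sketch of the distinction, with toy users, roles and permissions: authentication and authorization are separate checks, and a stale role grant slips past even a flawless authentication step.

```python
# Minimal sketch separating authentication ("who are you?") from
# authorization ("what may you do?"). Users, roles and permissions
# are toy assumptions to illustrate privilege creep.
USERS = {"alice": "s3cret"}                      # authentication data
ROLES = {"alice": {"reporting", "billing"}}      # authorization data

PERMISSIONS = {
    "reporting": {"read_reports"},
    "billing":   {"read_invoices", "issue_refunds"},
}

def authenticate(user: str, password: str) -> bool:
    return USERS.get(user) == password

def authorize(user: str, action: str) -> bool:
    granted = set().union(*(PERMISSIONS[r] for r in ROLES.get(user, set())))
    return action in granted

# Privilege creep: alice changed teams but kept the "billing" role.
# Strong authentication still lets an over-authorized insider act:
assert authenticate("alice", "s3cret")
assert authorize("alice", "issue_refunds")  # a policy review should catch this
```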

3. Biometrics

In the authentication framework, biometrics are a factor linked to something you are, and they can be incredibly difficult to steal, spoof, or lose. That’s what’s so strong about them. Typically, people think of biometrics as things linked to physical characteristics—like eyes and fingers. They’re something you’re born with, right? Not necessarily. Yes, physical characteristics that you’re born with still account for the largest portion of biometric use cases. But there’s another category: behavioral biometrics. Your voice, gait, your way of typing, and a whole host of other unique characteristics are all a part of this group. These “life measurements” are acquired over a lifetime and may change subtly, all while remaining as unique as a fingerprint.

4 & 5. Federation and Single Sign-On

To nail down the differences between these two terms, let’s start by explaining the comparatively simple structure of an SSO authentication environment. Single sign-on allows you to sign on once with a service provider for a range of services, allowing that one authentication event to give you access to a suite of services. There are plenty of services that enable SSO, and the beauty of SSO is how frictionless it is for users. 

Federation works slightly differently, as it isn’t just requesting access from a single service provider. There’s still one sign-on involved on the user’s end, but not on the back end. Instead, federation relies on a trust relationship between multiple service providers, with a single source for that trust. So, the user signs on to the source of the trust relationship (a centralized identity provider, or IDP) with all of the necessary credentials once. Attempts to access federated services will involve re-authentication through that IDP. You won’t be using credentials to access those diverse services—the IDP will be sending them out. Same time savings as SSO, and similar risks if the IDP is breached.
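
Here is a minimal federation sketch using the PyJWT library: one identity provider signs an assertion, and each service provider verifies the signature rather than holding the user's credentials. The key handling and claims are simplified assumptions; real federations use standards such as SAML or OpenID Connect.

```python
# Minimal federation sketch with PyJWT: the IDP signs an assertion
# once, and every federated service trusts that signature instead
# of collecting the user's credentials itself.
import time
import jwt  # PyJWT

IDP_KEY = "demo-signing-key"  # in practice: an asymmetric key pair

def idp_sign_on(user: str) -> str:
    """The single sign-on event: the IDP issues a signed assertion."""
    claims = {"sub": user, "iss": "idp.example", "exp": time.time() + 300}
    return jwt.encode(claims, IDP_KEY, algorithm="HS256")

def service_provider_accepts(token: str) -> str:
    """Any federated service verifies the IDP's signature and issuer."""
    claims = jwt.decode(token, IDP_KEY, algorithms=["HS256"], issuer="idp.example")
    return claims["sub"]

token = idp_sign_on("alice")
print(service_provider_accepts(token))  # alice, on every federated service
```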

6. Zero Trust

A Zero Trust model says that anything coming onto your network (person or device) has to have a positive identity that’s verified by the system. Put simply, “Never trust, always verify.” That way, access is restricted to licit users and devices: trusted entities. When hundreds or even thousands of internet-enabled devices are able to come onto the network of a large organization, it’s crucial to give them access rights commensurate with what they need from the network—which shouldn’t be much.

So how does a Zero Trust security posture contribute to a safer organization? Basically, it makes sure that what’s on your network belongs there and heads off breaches by unauthorized devices that may not be properly configured. It also addresses vulnerabilities arising from use of your network’s resources by devices that may be communicating remotely over an insecure internet connection. Finally, it keeps users from bringing in their own less-secure devices and inadvertently causing a breach. No one wants to be that guy. With a Zero Trust security model, they wouldn’t get the opportunity.
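
A minimal sketch of the idea, with illustrative names throughout: every request re-verifies both the user's identity and the device's posture, then grants only a narrowly scoped entitlement.

```python
# Minimal Zero Trust sketch: every request re-verifies the caller's
# identity and the device's posture, then grants only the narrow
# access the request needs. All names here are illustrative.
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    device_id: str
    resource: str

TRUSTED_DEVICES = {"laptop-42": {"patched": True, "managed": True}}
ALLOWED = {("alice", "payroll-db"): {"read"}}  # least-privilege grants

def verify(request: Request, action: str) -> bool:
    """Never trust, always verify: identity + device + entitlement."""
    device = TRUSTED_DEVICES.get(request.device_id)
    if not device or not (device["patched"] and device["managed"]):
        return False  # unknown or poorly configured device: no access
    return action in ALLOWED.get((request.user, request.resource), set())

print(verify(Request("alice", "laptop-42", "payroll-db"), "read"))  # True
print(verify(Request("alice", "old-phone", "payroll-db"), "read"))  # False
```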

7. Phishing

Phishing, as you probably know, continues to be one of the most common security scams. Through email (the usual source), text, phone, or even messaging, social media, and productivity apps, crooks attempt to steal user data. Usually, they’ll pose as a legitimate organization and steal a bit of formatting from licit communications from those organizations. The goal is to get people to click a malicious URL, log in to a fake site, or download a virus-ridden attachment.

Because it can be devastatingly successful, cybercriminals have continued to innovate. They all want to build a better phish-trap, which is why there are some new terms associated with this old-school brand of attack, such as spear phishing, whaling, and clone phishing.

8. Internet of Things

If you think of a certain talking home speaker system or your smart oven when you think of the Internet of Things (IoT), you’re not alone. Consumer “smart” devices overwhelm the public imagination when it comes to IoT. The surface area of this ecosystem and its vulnerability to breach is enormous. A “headless” device, which has no clear user interface and may even communicate through archaic or unsecured protocols, is an attractive target for crooks. What’s crucial is to have an identity and access management solution that encompasses all of these headless devices (Zero Trust), ensuring that their access to the network is licit, and that no bad actors are hijacking the device to access your network.

The consequences of an IoT breach can be dire, but avoiding a breach isn’t necessarily simple or straightforward. A great illustration of this concept is the recent discovery that one popular smart light bulb sold on Amazon was essentially cracking network security wide open for many consumers without their knowledge. This bulb stored Wi-Fi credentials in plaintext in its firmware and had no security settings whatsoever. Because these devices tend to utilize low-energy chipsets and low-complexity code (code that is sometimes downright archaic), they can be a welcome mat for hackers. Today’s IoT ecosystem is full of mismatched headless or limited-UI devices that may be ticking time bombs.