Secrets of SaaS

Software as a Service is driving disruption – it allows innovation to develop at the speed that both the market and customers expect.

8 ways the cloud is more complex than you think

The cloud may spark automated push-button fantasies, but the reality of shifting to cloud services is quite a bit more complicated.

For a growing number of organizations, it’s not a question of whether they should move applications and development platforms to the cloud, but when.

The cloud has become so well entrenched in corporate IT that it’s increasingly difficult to imagine business without it. Still, the move to cloud services is not without its share of hardships, some of which can be totally unexpected.

A recent report by professional services and consulting firm Accenture notes that two-thirds of large enterprises are not realizing the full benefits of their cloud migration journeys, with the complexity of business and operational change among the key barriers.

Of the 200 senior IT professionals from large businesses surveyed, 55 percent cited business complexity and organizational change as a barrier to realizing the benefits of the cloud. Only security and compliance risk was cited more frequently. – Read more

How the future of cloud computing is impacting businesses

“The cloud” is enabling substantial change in how we store, access, and process data in nearly every aspect of our lives.

To illustrate that change, let’s first focus on life B.C. or before cloud. Before the cloud, information was stored on computers or local devices. If you needed that information, you had to be connected to that storage device or computer. Likewise, you needed a computer — often an expensive one — with enough power to process the information.

This B.C. world was static, risky, and inefficient.

It was static because it did not enable any true mobility. Data was stored in the same way traditional books are stored in libraries. If you wanted to read another book, you had to be in a specific location.

This world was also risky, as the loss of the computer or storage device meant the information was also gone. Like the library, if the building burnt down, the books would be destroyed.

The inefficiency stems from the need to have many separate devices able to store and process the information. These devices were costly, power hungry, and made obsolete on a regular basis.

Life within the cloud provides solutions to those problems. Gathering up all this diffused information and moving it to the cloud means that the data is available anywhere. The risk of data loss is mitigated because information is distributed among many data centers rather than at one location.

While tech companies love to sell the latest and greatest devices to keep up with the demands of applications, this will change. Instead of having many high-powered, expensive devices to store and process information, there will instead be longer-lasting, lower-powered devices that need only display the information. Upgrades in the cloud will bring new capabilities without changing out the user’s device. – Read more

How to mitigate the privacy risk to cloud-stored data

Large-scale cloud deployment brings huge advantages, if the risks are correctly managed.

The cloud offers a multitude of advantages, however as with any large-scale deployment, it can also provide unforeseen challenges.

The concept of the cloud being “someone else’s data centre” has long irked security pros – it reinforces the notion that security responsibility is someone else’s problem.

It is true that cloud systems, networks and applications are not physically located within a company’s environment. Cloud infrastructure providers manage how the environment is set up and monitored, as well as what is put into it and how data is protected.

But ongoing security responsibility and risk mitigation fall squarely on the customer, and what matters most is managing that risk in alignment with the existing security framework.

Cloud security privacy risks

GDPR and its ‘sister’ policies in the US (as seen in Arizona, Colorado and California) mean organizations now face new requirements for protecting data in the cloud.

While it used to be as simple as deploying Data Loss Prevention (DLP) in a data center, nowadays, due to data center fragmentation, this is no longer viable. There are now services, systems and infrastructure that are no longer owned by the organization, but still require visibility and control.

Cloud services and infrastructures that share or exchange information can also become difficult to manage. For example, who owns the SLAs? Is there a single pane of glass that monitors everything?

DevOps has forced corporations to go as far as implementing micro-segmentation and adjusting processes around firewall rule change management. Additionally, serverless computing lets organisations cut costs and speed up productivity by allowing developers to run code without having to worry about platforms and infrastructure.
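To make the serverless point concrete, here is a minimal sketch of an AWS Lambda-style handler in Python. The event shape, field names and order logic are hypothetical, invented purely for illustration; the point is that the developer writes only this function while the provider provisions, scales and bills the execution environment.

```python
import json

# Minimal AWS Lambda-style handler (illustrative only).
# The event shape and the business logic are hypothetical; the provider
# invokes this function on demand, so the developer manages no servers
# or platform code.
def handler(event, context):
    try:
        order = json.loads(event.get("body", "{}"))
        total = sum(item["qty"] * item["price"] for item in order.get("items", []))
        return {
            "statusCode": 200,
            "body": json.dumps({"order_id": order.get("id"), "total": total}),
        }
    except (ValueError, KeyError, TypeError) as exc:
        return {"statusCode": 400, "body": json.dumps({"error": str(exc)})}
```

The handler itself is ordinary Python; what makes it “serverless” is that the cloud provider supplies the execution environment and charges per invocation.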

Yet, without a firm handle on virtual private clouds and workload deployments, things can quickly spin out of control and data can begin leaking from one environment just as a comfortable level of security is achieved in another. – Read more

Understanding hybrid cloud security across your enterprise

Hybrid cloud security is vital as enterprises continue to develop their hybrid cloud services, since more sensitive information has the potential to become exposed.

Is perimeter security enough today?

Businesses continue to embrace the power and flexibility of the hybrid cloud. Ensuring these networks are secure is paramount. Used to securing siloed data and managing secure logins for applications, CTOs and their teams have had to embrace a brave new world. In this new security environment, sensitive information can sit outside the enterprise’s firewalls. Here, a robust hybrid cloud security policy is critical.

In a hybrid cloud environment, there are particular threats that CIOs and CTOs must be aware of, even though the hybrid cloud doesn’t necessarily bring with it a new set of security issues to mitigate and defend against.

Data breaches, ransomware, phishing attacks and BEC scams are a present danger to traditional networks. Moving to a cloud environment may potentially amplify the attacks. It is, however, a mistake to think that the hybrid cloud necessarily brings new and unique threats. Security should be treated as all-encompassing no matter what kind of cloud services are in use.

Also, enterprises often believe that once a hybrid cloud is set up, responsibility for hybrid cloud security shifts to the vendor or service provider.

Research in the report ‘Cloud and hybrid environments: The state of security’ from Algosec concluded that 58% of respondents use the cloud provider’s native security controls to secure their cloud deployments.

A further 44% said they also use third-party firewalls deployed in their cloud environment, specifically Cisco Adaptive Security Virtual Appliance, Palo Alto Networks VM Series, Check Point vSEC, Fortinet FortiGate-VM and Juniper vSRX. This created a mixed estate of traditional and virtualized firewalls alongside cloud security controls. – Read more

How managed cloud service providers bring benefits

The emergence and success of the cloud over the last decade is unquestioned.

It is nearly impossible to read any current IT publication that does not cover the growth, innovation and emerging use cases that are enabled by cloud.

And for most, the word “cloud” is synonymous with Amazon Web Services, Microsoft Azure and Google, writes Danny Allan, vice-president of product strategy at Veeam.

This makes sense, as they are by far the most recognised and successful providers in the hyperscale public cloud. And yet, these three leaders in the industry are not destined for complete market domination. There not only remains a place for the Managed Cloud Service Provider (MCSP), but there are multiple reasons why MCSPs are essential and often the better choice for a significant majority of the IT market.

Public cloud infrastructure requires specific expertise

Many organisations will often start with the false premise that they can simply pick up the on-premises infrastructure and move it over to the hyper-scale public cloud. While this has often been the promise and the general infrastructure is similar, the reality is that many basic elements such as control planes, networking and security are different enough that challenges very quickly emerge.

Choosing an MCSP helps in one of two ways: the hosted service environment can closely mirror the on-premises environment, or a managed service can effectively broker the public cloud and introduce the cloud expertise necessary to integrate the two different environments.

In an ecosystem where time-to-offering is an essential competitive advantage, this value should not be under-estimated. In fact, it is very likely that the MCSP community will evolve into a front line for brokering the hyper-scale public clouds, while facilitating and managing the transition and the hybrid environment.

Cloud economics need to be effectively managed

“The cloud is not a charity” is one of my favorite statements. The ability for the cloud to drive profit is based on the ability to layer in margin. While the public cloud can be very cost-effective for elastic workloads with a high degree of variability, for placing workloads in a remote location, or for taking advantage of a pre-configured service, it can be significantly less cost-effective for static workloads with predictable infrastructure needs.
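As a rough sketch of that economics argument, the snippet below compares pay-per-use pricing against a flat hosted fee at different utilisation levels. All of the rates are invented assumptions, not real provider pricing; the shape of the result – pay-per-use winning for bursty workloads and losing for always-on ones – is the point.

```python
# Illustrative cost comparison: elastic vs. static workloads.
# All numbers below are assumptions for the sake of the example,
# not actual provider pricing.

HOURS_PER_MONTH = 730

def on_demand_cost(hourly_rate: float, utilisation: float) -> float:
    """Public-cloud style pay-per-use: you pay only for the hours actually used."""
    return hourly_rate * HOURS_PER_MONTH * utilisation

def hosted_flat_cost(monthly_rate: float) -> float:
    """MCSP/hosted style: a fixed monthly fee regardless of utilisation."""
    return monthly_rate

if __name__ == "__main__":
    hourly = 0.40   # assumed on-demand $/hour for a mid-sized VM
    flat = 180.0    # assumed flat monthly fee for an equivalent hosted VM

    for utilisation in (0.15, 0.50, 1.00):  # elastic, mixed, always-on
        od = on_demand_cost(hourly, utilisation)
        print(f"utilisation {utilisation:>4.0%}: on-demand ${od:7.2f} "
              f"vs flat ${hosted_flat_cost(flat):7.2f}")
```

At 15% utilisation the pay-per-use bill is a fraction of the flat fee; at 100% it is well above it, which is the static-workload case the article describes.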

As the various public clouds all vie for market dominance and customers choose which workloads are best suited to the public cloud benefits, MCSPs offer the ability to abstract the workload from the public cloud, while closely monitoring the cost characteristics and shifting the data and service based on the customer ROI.

This is a distinct and definite value add that most customers are unable to measure and recognise.

Many services offer a superior end user experience in a decentralised model – Read more

How to prevent the top 11 threats in cloud computing

The latest risks involved in cloud computing point to problems related to configuration and authentication rather than the traditional focus on malware and vulnerabilities, according to a new Cloud Security Alliance report.

Using the cloud to host your business’s data, applications, and other assets offers several benefits in terms of management, access, and scalability. But the cloud also presents certain security risks. Traditionally, those risks have centered on areas such as denial of service, data loss, malware, and system vulnerabilities. A report released Tuesday by the Cloud Security Alliance argues that the latest threats in cloud security have now shifted to decisions made around cloud strategy and implementation.

Based on a survey of 241 industry experts on security issues in the cloud industry, the CSA’s report Top Threats to Cloud Computing: The Egregious 11 focused on 11 notable threats, risks, and vulnerabilities in cloud environments. For each threat described, the report highlights the business impact, specific examples, and recommendations in the form of key takeaways.

1. Data breaches

A data breach can be any cybersecurity incident or attack in which sensitive or confidential information is viewed, stolen, or used by an unauthorized individual.

Business Impact

  • Data breaches can damage a company’s reputation and foster mistrust from customers and partners.
  • A breach can lead to the loss of intellectual property (IP) to competitors, impacting the release of a new product.
  • Regulatory implications may result in financial loss.
  • Impact to a company’s brand could affect its market value.
  • Legal and contractual liabilities may arise.
  • Financial expenses may occur as a result of incident response and forensics.

Key Takeaways and Recommendations

  • Defining the business value of data and the impact of its loss is essential for organizations that own or process data.
  • Protecting data is evolving into a question of who has access to it.
  • Data accessible via the Internet is the most vulnerable asset for misconfiguration or exploitation.
  • Encryption techniques can protect data but can also hamper system performance and make applications less user-friendly (see the sketch after this list).
  • A robust and well-tested incident response plan that considers the cloud provider and data privacy laws can help data breach victims recover.
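
As a minimal sketch of the encryption trade-off noted in the list above, the snippet below uses the cryptography package’s Fernet recipe to encrypt a record before it would be written to cloud storage. The payload is hypothetical and the key handling is deliberately simplified; a real deployment would pull the key from a managed key service, and the extra encrypt/decrypt round-trip is where the performance cost comes from.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# In practice the key would come from a key-management service, not be
# generated inline; this is only to keep the example self-contained.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b'{"customer_id": 42, "card_last4": "1234"}'   # hypothetical payload

ciphertext = fernet.encrypt(record)     # what actually gets stored in the cloud
plaintext = fernet.decrypt(ciphertext)  # the extra round-trip is the performance cost

assert plaintext == record
print(len(record), "bytes plain ->", len(ciphertext), "bytes encrypted")
```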

2. Misconfiguration and inadequate change control

Misconfiguration occurs when computing assets are set up incorrectly, leaving them vulnerable to malicious activity. Some examples of misconfiguration include:

  • Unsecured data storage elements or containers
  • Excessive permissions
  • Unchanged default credentials and configuration settings
  • Standard security controls left disabled
  • Unpatched systems
  • Logging or monitoring left disabled
  • Unrestricted access to ports and services

– Read more
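As a hedged illustration of how such examples can be checked, here is a toy configuration audit over a hypothetical resource inventory. The inventory format and rule names are invented; in practice the data would come from a cloud provider’s inventory APIs or a policy-as-code tool.

```python
# Toy misconfiguration audit over a hypothetical resource inventory.
# The inventory format and checks are illustrative, not tied to any provider.

resources = [
    {"name": "customer-backups", "type": "storage", "public": True, "encrypted": False},
    {"name": "admin-portal-vm", "type": "vm", "open_ports": [22, 3389], "default_credentials": True},
    {"name": "billing-db", "type": "database", "logging_enabled": False, "patched": False},
]

def audit(resource: dict) -> list[str]:
    findings = []
    if resource.get("public"):
        findings.append("storage element exposed to the internet")
    if resource.get("encrypted") is False:
        findings.append("encryption at rest disabled")
    if resource.get("default_credentials"):
        findings.append("default credentials unchanged")
    if 22 in resource.get("open_ports", []) or 3389 in resource.get("open_ports", []):
        findings.append("management ports open without restriction")
    if resource.get("logging_enabled") is False:
        findings.append("logging/monitoring disabled")
    if resource.get("patched") is False:
        findings.append("system unpatched")
    return findings

for r in resources:
    for finding in audit(r):
        print(f"{r['name']}: {finding}")
```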

Cloud computing is great, but it isn’t a backup system

One of the great things about the modern internet is the way in which we can share and move files around. But if you are dealing with important information, cloud computing is in no way a good backup system.

Sometimes it’s hard to know how excited to be about the thing we’re now calling “the cloud.” In one sense, it’s not a particularly new idea: most of us have been using something we could reasonably call “cloud email” since the 1990s. Where’s your email stored? Doesn’t matter; it’s on a server somewhere. Of course, we could write off much of the modern cloud as “a server somewhere.” What’s changed is that everyone’s internet connections, even wireless internet connections over large areas, have made it possible to do a lot more with the system than just move tiny fragments of data such as emails.

In a lot of the world, and particularly here in the UK, governments and corporations seem to have concluded that there’s a certain maximum amount of money people are willing to pay for an internet connection, and that making greater provision doesn’t necessarily mean more income for a service provider. That’s a shame, because it risks denying a number of possible futures for the internet in general and the cloud – let’s call it wide-area distributed computing – in particular. Even so, the capability available is now enough to mean anyone who encounters a short-notice and temporary need for several terabytes of storage can have it, even if a bit slowly on the average home broadband connection. Hopefully, that’ll be a statement we can come back to and laugh about in a few years.

One of the most popular and encouraging things about the cloud is that reliability isn’t the user’s problem. At some level, cloud storage still (almost invariably) boils down to some hard disks in a rack somewhere, but we can quite safely assume that those hard disks will be part of some larger reliability arrangements, whether a conventional disk array or, more likely, some sort of vast distributed object storage system, the details of which are hidden from, and irrelevant to, the user. Either way, there will be redundancy built in. Usually, those racks full of servers are kept in deliberately nondescript buildings on industrial estates round the world.
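
To put a rough number on that built-in redundancy, the sketch below estimates the chance of losing an object that is replicated across independent disks or sites. The per-replica failure probability is invented and the independence assumption is a simplification; real object stores also use erasure coding rather than plain replication.

```python
# Rough, illustrative durability estimate for replicated storage.
# Assumes each replica fails independently within the year, which is a
# simplification; the per-replica failure probability below is invented.

def loss_probability(p_single_failure: float, replicas: int) -> float:
    """Probability that every replica is lost (and the object with it)."""
    return p_single_failure ** replicas

p = 0.02  # assumed 2% annual chance of losing any one replica
for n in (1, 2, 3):
    print(f"{n} replica(s): ~{loss_probability(p, n):.6%} chance of loss per year")
```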

The simplest cloud provision might not, or in fact probably would not, even let a user know where the data is physically located; as we know from Amazon’s example, these places are somewhat security-sensitive. Some cloud providers, though, offer the option to specifically keep data in more than one of these locations, perhaps several at once. That creates a really impressive degree of what the information technology industry terms “disaster recovery.” If something genuinely catastrophic happens – fire, flood, meteor strike, zombie apocalypse – then there’s a huge resistance to actually losing any data. Great! Now we don’t need that LTO drive anymore, do we? – Read more

The Essential SaaS Metrics for Growth

“If you cannot measure it,” declared Lord Kelvin, “you cannot improve it.” Perhaps SaaS companies have taken this advice too literally.

SaaS sales and marketing teams can get overwhelmed by metrics. But without any metrics, it’s impossible to track growth. And without growth, a SaaS company is dead in the water.

According to Statista, the SaaS market will reach $157 billion next year. And while that figure is promising, early-stage SaaS companies need a ton of growth to survive. In fact, SaaS companies with an annual growth rate of 20% or less have a 92% chance of failure, according to research by McKinsey.

That same research found that “super growers” were eight times more likely than “stallers” to grow from $100 million to $1 billion, and three times more likely to do so than “growers.”
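
To get a feel for why growth rate matters so much, the snippet below computes how long a constant annual growth rate takes to turn $100 million into $1 billion. The growth rates are illustrative picks, not the thresholds the McKinsey research uses to define “super growers,” “growers,” and “stallers.”

```python
import math

# How many years to grow revenue 10x (e.g. $100M -> $1B) at a constant
# annual growth rate? The rates below are illustrative examples.

def years_to_multiple(annual_growth: float, multiple: float = 10.0) -> float:
    return math.log(multiple) / math.log(1.0 + annual_growth)

for growth in (0.20, 0.40, 0.60, 1.00):
    print(f"{growth:>4.0%} annual growth: about {years_to_multiple(growth):4.1f} years to 10x")
```

At 20% annual growth the 10x journey takes over a decade; at 100% it takes a little over three years, which is why growth rate separates the survivors from the rest.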

If growth is the best way to get out alive, marketing metrics do little unless they correlate with sales. After talking with a bunch of SaaS experts, here’s what I learned about which SaaS metrics deserve focus—and which ones don’t.  – Read more

Forget About SaaS: Software-as-a-Lender Could Be The Next Big Thing

Ella is a first-time business owner who started a landscaping company two years ago that is now growing faster than she ever could have imagined.

In order to keep up with the rush of new business, she needs a loan as soon as possible to hire additional employees and buy more equipment. If she doesn’t get it, she’ll need to turn away new customers – something that could cripple her company’s reputation. She has been to three banks and called a dozen more, but the answer is the same from all of them – they’ll need hundreds of documents and three months to review them before they can even consider her for a loan. Ella has neither the time nor the resources for that.

The next day, however, she gets a business-saving email from her software-as-a-service (SaaS) provider, Green Software Systems. Ella uses Green Software to manage her scheduling, payroll, job status, and communication with employees. They also process all of Ella’s payments, making them a SaaS + Payments provider; being in Ella’s payment flow lets them collect verified information about her business performance. Now they have announced that they are launching a new lending product for their customers. With the click of a button, Green Software could issue Ella a one-year loan to cover the new employees and equipment. It is the lifeline Ella needs, and just like that, she is armed with the capital to expand her business and take on new customers.

Many small business owners constantly find themselves in Ella’s position, but the process of applying for a loan and getting approved is daunting and time-consuming. For example, if you’re applying for a Small Business Administration (SBA) loan, you typically have to provide a long list of documents: a business profile, resumes for “each owner and key member of management,” personal and business financial statements, cash flow projections, and many other statements and disclosures. The entire process takes an average of 60 to 90 days. Even if the application is completed, there is still an 82% chance it will be denied. And even if a loan is approved, the borrower still has to deal with the administrative headache of providing continuous and manual documentation after the loan is issued.

SaaS Companies Should Expand their TAM by Moving into Adjacent Products & Services

As we discussed in a prior article, many vertical SaaS companies like the one Ella uses have added payment processing functionality in order to expand their total addressable market (TAM) and become vertical SaaS + Payment companies. In the article, we used the example of Toast, a staff management, POS, and payment processor for the restaurant industry. As shown below, Toast increased their TAM 3.5x by incorporating payment processing into their SaaS solution. – Read more
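
As a back-of-the-envelope version of that expansion, the sketch below shows the mechanics with invented numbers (chosen so the combined figure lands near a 3.5x multiple, not drawn from Toast’s actual economics): a payments take rate applied to processed volume can dwarf the subscription revenue alone, which is how a vertical SaaS company multiplies its addressable market.

```python
# Illustrative TAM expansion from adding payments to a vertical SaaS product.
# All figures are invented assumptions, not Toast's actual economics.

customers = 50_000                    # assumed addressable businesses in the vertical
subscription_per_year = 1_200.0       # assumed software subscription per customer ($)
payment_volume_per_year = 500_000.0   # assumed processed volume per customer ($)
take_rate = 0.006                     # assumed net take rate on payments (0.6%)

software_tam = customers * subscription_per_year
payments_tam = customers * payment_volume_per_year * take_rate

print(f"software-only TAM : ${software_tam:,.0f}")
print(f"payments TAM      : ${payments_tam:,.0f}")
print(f"combined multiple : {(software_tam + payments_tam) / software_tam:.1f}x")
```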