Before You Commit to a Vendor, Consider Your Exit Strategy

Baan Alsinawi
Vendor lock-in. What is it? Vendor lock-in occurs when you adopt a product or service for your business, and then find yourself locked in, unable to easily transition to a competitor's product or service. Vendor lock-in is becoming more prevalent as we migrate from legacy IT models to the plethora of sophisticated cloud services offering rapid scalability and elasticity, while fueling creativity and minimizing costs.

However, as we rush to take advantage of what the cloud has to offer, we should plan strategically for vendor lock-in. What happens if you find another cloud provider that you prefer? How will you migrate your services? What are the costs, how disruptive will it be, and will you have the professional talent to transition successfully?

As a vendor, locking in customers by ensuring that they cannot easily transition elsewhere is smart business. However, as a buyer looking for innovative solutions and a better value for services, you require flexibility if your business needs change, or if a vendor is no longer available due to bankruptcy or restructuring.

As you adopt a growing array of cloud-based anything-as-a-service (XaaS) to outsource your business support functions—from Salesforce to AWS services, Google docs to Microsoft Office 365—consider your exit strategy if your business needs change, or your vendor is no longer available.

Take a step back and consider vendor lock-in as part of your overall risk management strategy. A single cloud provider can offer great options for redundancy, risk management and design innovation. But what happens when you consider redundancy across multiple providers? How easy is it to have a primary service on AWS and a secondary/backup on Google? Not so easy.

Best practices suggest that you shouldn’t put all your eggs in one basket. However, developing a SaaS solution designed to work on two disparate cloud services is a complex undertaking. If you are simply using the cloud for storage/raw data backup, you may be able to transfer data between providers. Even then, you need to pay attention to data structures and standards across platforms. When developing complex solutions that rely on outsourced technologies such as AWS continuous integration/continuous delivery (CI/CD) services, Splunk Cloud for auditing, or Qualys Cloud for vulnerability scanning, how much redundancy and portability are you baking into your risk management strategy?

Also, what happens if AWS is no longer available? This seems highly unlikely today, with Amazon’s stock hovering at around US $2,000 a share. But what if your new CIO decides Azure offers better widgets? Or your CISO wants a primary platform on AWS and a backup on Oracle? There are vast differences in these platforms, and one development effort will not be easily portable to the other.

For example, TalaTek is developing its own next-generation cloud-based solution for its current platform. We must consider the additional time, multiple developers and increased complexity required to operate on two different cloud platforms to manage this risk. The question we ask is: can we afford not to plan an exit strategy should our strategic business goals change?

Acknowledging the risk, and in some cases accepting it, is a key aspect of risk management. TalaTek has accepted the risk in adopting a single cloud platform, since it makes business sense to do so.

What should you consider when adopting cloud-based services? Here are our top five considerations:

  • Have a resilient risk management strategy that requires you to continuously re-evaluate your risk assumptions and diligently monitor market changes.
  • Negotiate strong service-level agreements, vetted by legal experts, in the design of your cloud strategy.
  • Align your business and IT/cloud strategies to protect your investments and ensure continuity of operations.
  • Where possible, use open source stacks and standard API structures to provide portability and interoperability (see the sketch following this list).
  • Consider whether your risk tolerance allows you to accept some risk. If you are offering a SaaS solution to manage your client’s CRM, your risk tolerance is different from that of a hospital using the cloud to manage all of its client health data.
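On the open-standards point above, here is a minimal sketch in Python of what keeping provider-specific calls behind your own small interface can look like. It uses the real boto3 S3 client for one backend and a plain local-disk stand-in for the other; the bucket, key prefix and helper names are illustrative assumptions, not a prescription.

```python
# Minimal sketch: isolate provider-specific storage calls behind one interface
# so application code never imports a vendor SDK directly.
from abc import ABC, abstractmethod
from pathlib import Path

import boto3  # AWS SDK; only the S3 adapter below knows about it


class ObjectStore(ABC):
    """Provider-neutral contract the rest of the application codes against."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class S3Store(ObjectStore):
    """Adapter for Amazon S3."""

    def __init__(self, bucket: str):
        self._s3 = boto3.client("s3")
        self._bucket = bucket

    def put(self, key: str, data: bytes) -> None:
        self._s3.put_object(Bucket=self._bucket, Key=key, Body=data)

    def get(self, key: str) -> bytes:
        return self._s3.get_object(Bucket=self._bucket, Key=key)["Body"].read()


class LocalStore(ObjectStore):
    """Stand-in for any second provider or on-premises fallback."""

    def __init__(self, root: str):
        self._root = Path(root)
        self._root.mkdir(parents=True, exist_ok=True)

    def put(self, key: str, data: bytes) -> None:
        path = self._root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(data)

    def get(self, key: str) -> bytes:
        return (self._root / key).read_bytes()


def archive_report(store: ObjectStore, name: str, content: bytes) -> None:
    # Business logic sees only the interface; swapping providers becomes a
    # configuration change rather than a rewrite.
    store.put(f"reports/{name}", content)
```

How much such a seam is worth depends on how deeply you rely on provider-specific services, but it makes the portability decision explicit rather than accidental.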

The cloud is here to stay. Assess your options, be smart about your strategy, and consider your exit options as you embark on the exciting journey into the cloud.

FedRAMP: Friend or Foe for Cloud Security?

Baan Alsinawi
Cloud security is on everyone’s minds these days. You can’t go a day without reading about an organization either planning its move to the cloud or actively deploying a cloud-based architecture. A great example is the latest news about the US Department of Defense and its ongoing move to the cloud.

The US government is leading the charge by encouraging the private sector to provide secure cloud service offerings that enable federal agencies to adopt the cloud-first policy (established by the Office of Management and Budget in 2016) using FedRAMP. FedRAMP is a US government-wide approach for security assessment, authorization and continuous monitoring for cloud products and services. It sets a high bar for compliance with standards that ensure effective risk management of cloud systems used by the federal government.

There is even some chatter now about establishing FedRAMP in law, in an effort to encourage agencies to adopt the cloud at a more rapid pace. The delay in adoption is in no small measure related to the complexity and intensive resource requirements of the current FedRAMP processes, and to the difficulty of finding FedRAMP-certified providers.

One of the main obstacles to wider adoption of FedRAMP is the difficulty for industry, Third Party Assessment Organizations (3PAOs) and Cloud Service Providers (CSPs) in determining what the profitability model is for engaging in the FedRAMP program.

Establishing such metrics can create key drivers for industry adoption, perhaps by allowing CSPs to determine how offering FedRAMP-accredited IaaS/SaaS/PaaS can be truly beneficial and profitable for their bottom line, while at the same time allowing agencies to determine the cost-effectiveness of a move to the cloud.

While achieving FedRAMP accreditation has many challenges (as TalaTek learned over the past 18 months during deployment of its own cloud-based solution), there are clear benefits for federal agencies and industry in working with FedRAMP-authorized service providers. At a high level, these include established trust in the effectiveness of implemented controls and improved data protection measures.

Despite the many challenges to adoption, I am a big believer that the benefits of the FedRAMP program outweigh its challenges, especially in the long run, after the kinks are ironed out and program maturity improves through increased adoption by both government and private industry.

The FedRAMP program provides significant value by increasing protection of data in a consistent and efficient manner – a key need among government organizations and especially among information sharing agencies – by providing these key benefits: 

  • Enables a more successful move to the cloud for federal agencies;
  • Ensures a minimum security baseline for all cloud services; 
  • Provides managed security continuity for a cloud offering versus a one-time compliance activity;
  • Standardizes requirements for all cloud service providers; and
  • Creates a 3PAO cadre that is capable, certified and able to provide quality assurance for cloud implementations.

By providing a unified, government-wide framework for managing risk, FedRAMP overcomes the downside of duplication of effort and inefficiency associated with existing federal assessment and authorization processes.

When considering a move to the cloud and the level of security it requires, we should all take risk management seriously: invest in skill development and knowledge, adapt our processes for the 21st century, and prepare for a near future dominated by the cloud. FedRAMP provides a roadmap for any organization to achieve these goals.

Key Takeaways from a Recent Cloud Training

Adam Kohnke
I recently began taking my first crack at auditing an Amazon cloud platform that comprises over a dozen managed services. While I was excited to add this new wrinkle to my skill set, I had no idea where to start on identifying the key risks applicable to each service or how to approach the engagement. Searching online eventually led me to the AWS training and certification website. My intuition suggested that Amazon was not very likely to help me audit its services, and that even if it did, there probably would not be much free information available that I could leverage to gain a sufficient understanding of the service architecture or operation. Well, I was dead wrong!

This blog post covers my experience with the free Cloud Practitioner Essentials course offered by Amazon and some of the key takeaways I obtained from various sections of the training.

Topics covered and use for auditors
The course takes about eight hours to complete, divided among five major sections (Cloud Concepts, Core Services, Architecture, Security, and Pricing and Support). If you can forgive Amazon’s “pointing two thumbs at itself” advertising tone, you can start picking up key risks and audit focus areas as you march on. My key takeaway from the Core Services overview portion of the course was Trusted Advisor, a management tool auditors can use to get a very quick view of how well their organization is using the collection of services. Trusted Advisor highlights whether cost optimization of services is being achieved while providing recommendations on how to improve service usage. It also reports on whether performance opportunities exist, whether major security flaws are present (such as not using multifactor authentication on accounts), and the degree of fault tolerance that exists across your compute instances. Trusted Advisor is not the be-all and end-all, as it uses a limited set of checks, but it provides a quick health check of the environment. A list of Identity and Access Management (IAM) best practices was also provided, along with details on how to validate them.
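As a rough illustration of how an auditor might pull Trusted Advisor findings programmatically rather than through the console, the sketch below uses boto3’s Support API. The output handling is my own assumption; note that the Support API requires a Business or Enterprise support plan and is served only from the us-east-1 endpoint.

```python
# Sketch: list Trusted Advisor checks and flag any whose status is not "ok".
import boto3

support = boto3.client("support", region_name="us-east-1")

checks = support.describe_trusted_advisor_checks(language="en")["checks"]
for check in checks:
    result = support.describe_trusted_advisor_check_result(
        checkId=check["id"], language="en"
    )["result"]
    if result["status"] != "ok":  # "warning", "error" or "not_available"
        print(f'{check["category"]}: {check["name"]} -> {result["status"]}')
```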

Security, shared responsibility and suggested best practices
Amazon does a good job of pounding the shared responsibility security concept into your head for managing its services and touches on this concept throughout the training. Amazon is responsible for security of the cloud (physical security, hardware/firmware patching, DR, etc.) and the customer is responsible for security in the cloud (controlling access to data and services, firewall rules, etc.). My key takeaway from this portion of the training centered on some of the best security practices for managing the root user account issued to you upon establishing web services with Amazon. Root accounts are the superuser accounts that have complete access and control over every managed service in your portfolio. Because you cannot restrict the abilities of the root account, Amazon recommends deleting the access keys associated with the root account, establishing an IAM administrative user account, granting that account the necessary admin permissions and using that account for managing services and security.
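Here is a minimal sketch of how an auditor might verify two of those root-account recommendations (no root access keys, MFA enabled on the root account) using boto3’s IAM GetAccountSummary call; the finding messages are illustrative wording of my own.

```python
# Sketch: check for root access keys and root-account MFA via the IAM API.
import boto3

iam = boto3.client("iam")
summary = iam.get_account_summary()["SummaryMap"]

# A value of 1 means the root user still has access keys -- Amazon recommends
# deleting them and working through an IAM administrative user instead.
if summary.get("AccountAccessKeysPresent", 0):
    print("Finding: root access keys are present and should be removed.")

# A value of 0 means MFA is not enabled on the root account.
if not summary.get("AccountMFAEnabled", 0):
    print("Finding: multifactor authentication is not enabled for the root account.")
```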

Detailed and helpful audit resources
The pricing and support section of the training provided very useful metrics that financial auditors and purchasing personnel should be interested in to help them determine the cost and efficiency of managed services. This portion of the course covers fundamental cost drivers (compute, storage and data transfer out) and very specific cost considerations for each fundamental cost driver, such as clock time per instance, level of monitoring, etc.
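To show how those three fundamental cost drivers combine, here is a small worked sketch; the unit rates are hypothetical placeholders, not Amazon’s actual prices, which vary by region, instance type and service.

```python
# Sketch: rough monthly estimate from the three fundamental cost drivers.
# All rates below are hypothetical placeholders for illustration only.
COMPUTE_RATE_PER_HOUR = 0.10       # per instance-hour
STORAGE_RATE_PER_GB_MONTH = 0.023  # per GB stored per month
TRANSFER_OUT_RATE_PER_GB = 0.09    # per GB of data transferred out

def monthly_estimate(instance_hours: float, stored_gb: float, egress_gb: float) -> float:
    compute = instance_hours * COMPUTE_RATE_PER_HOUR
    storage = stored_gb * STORAGE_RATE_PER_GB_MONTH
    transfer = egress_gb * TRANSFER_OUT_RATE_PER_GB
    return compute + storage + transfer

# Example: two instances running all month, 500 GB stored, 200 GB egress.
print(f"Estimated monthly cost: ${monthly_estimate(2 * 730, 500, 200):,.2f}")
```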

My key takeaway from this portion of the course was the overview of the support options – more specifically, the audit white papers that are maintained and made available at no cost. There are very specific security audit guides, best practices for security, operational checklists and other information that will allow an auditor to build an accurate and useful engagement program.

This course led me to the Security Fundamentals course, which provides enhanced audit focus on several topics of interest to IT auditors, including access management, logging, encryption and network security.

In conclusion, I am surprised and extremely impressed with the amount and depth of free resources Amazon makes available to the auditing community regarding its services. I wish other providers would take similar measures to assist the information security community with understanding what is important and how to gain assurance over the secure operation of their services.

Editor’s note: View ISACA’s audit/assurance programs here.

The Impact of Net Neutrality on Cloud Computing

Marty Puranik
The US Federal Communications Commission (FCC) recently repealed the net neutrality guidelines that it implemented less than three years ago. There has been much discussion, speculation and concern about how that move will impact the Internet, small business and consumers. Many people have suggested that one effect of the repeal will be that video streaming and other cloud hosted, web-delivered media will start to cost much more for the consumer.

When the FCC enacted the 2015 regulations, it became unlawful for broadband providers to slow down or block certain web traffic. The 2015 rules did not cover enterprise web access, which is often custom; they did, however, safeguard the flow of data to small businesses.

FCC chair Ajit Pai and some lawmakers take the position that net neutrality policies and practices are unnecessary rules that discourage investment in broadband networks, placing an unfair strain on internet service providers (ISPs). That perspective does not seem to be aligned with that of the public: a University of Maryland poll found that 83% of voters favored keeping the net neutrality rules in place.

What is net neutrality, exactly?
The basic notion behind the concept of net neutrality, according to a report by Don Sheppard in IT World Canada, is that the government should ensure that all bits of data and all information providers are treated in the same way.

Net neutrality makes it illegal to offer paid traffic prioritization, to throttle or block traffic, or to engage in similar practices (see below).

Sheppard noted that there are two basic technical principles related to internet standards that are part of the basis for net neutrality:

  • End-to-end principle – Application-specific functions should take place at the endpoints of a general-purpose network rather than at intermediate nodes.
  • Best-efforts delivery – There is no performance guarantee, only a commitment to best-effort, equal and nondiscriminatory packet delivery.

IoT: harder for startups to compete?
Growth of the Internet of Things (IoT) is closely connected to the expansion of cloud computing, since the former uses the latter as its back end. In terms of impact on the IoT, Nick Brown noted in IoT For All that the repeal of net neutrality will create an uneven playing field in which it becomes more difficult for smaller organizations to compete, while larger firms will be able to form tighter relationships with ISPs.

Greater latency is a key concern with the removal of net neutrality, because latency could rise as sites are throttled (slowed down). Throttling could occur between one device and another because ISPs may want some devices (perhaps ones they build themselves) to perform better than others.

Some people think the impact of the net neutrality repeal on the IoT will be relatively minor. However, many thought leaders think there will be a significant effect since IoT devices rely so heavily on real-time analysis.

Entering a pay-to-play era
Throttling, or slowing throughput, could occur with video streaming services and other sites. Individual cloud services could be throttled. Enterprises could have difficulty with apps that they host in their own data centers, too, since those apps also require a fast internet connection to function.

There will be a pay-to-play scenario for web traffic instead of just using bandwidth to set prices, according to Forrester infrastructure analyst Sophia Vargas.

Competition between wired and wireless services has resulted from changes to their pricing models following the repeal of net neutrality, said Vargas. Wired, landline services are priced per bandwidth, while wireless services are priced per data consumed. Wireless services will have the most difficulty because wired services are controlled by a smaller number of ISPs.

There will be more negotiations and volatility in the wireless than in the wired market, noted Vargas. Competition is occurring “in the ability for enterprises to essentially own or get more dedicated [wired] circuits for themselves to guarantee the quality of service on the backend,” she added.

Does net neutrality really matter?
The extent to which people are committing themselves to one side or the other gives a sense of how critical net neutrality is from a political, commercial and technical perspective. Consumers should be aware of the potential for companies to mistreat them without these protections in place (which is not to say those abuses will occur).

Ways that ISPs could act against the precepts of net neutrality include:

  • Throttling – Some services or sites could be treated with slower or faster speeds.
  • Blocking – Access to the sites or services of the ISP’s competitors could become impossible because those sites are blocked.
  • Paid prioritization – Certain websites, such as social media powerhouses, could pay to get better performance (in reliability and speed) than is granted to competitors that may not have the same capital to influence the ISP.
  • Cross-subsidization – This process occurs when a provider offers discounted or free access to additional services.
  • Redirection – Web traffic is sent from a user's intended site to a competitor’s site.

Rethinking mobile apps
Another aspect of technology that will need to be rethought in the post-repeal world is improving efficiency by developing less resource-intensive mobile apps that are delivered through more geographically distributed infrastructure. Local caching could also help, and delivery of apps that serve video and images should potentially be restructured.

You can already look at file size to better balance the way you deliver video and images to mobile users. However, rendering, the quantity that is stream-loaded (to avoid additional pings) and other aspects are currently optimized with net neutrality as a given.

Providers of content delivery networks (CDNs) will need to re-strategize the methods they use to optimize enterprise traffic.

Cost has been relatively controlled in the past, according to Vargas. There is a category of performance management and wide area network (WAN) optimization software that was created to manage speed and reliability for data centers and mobile. Those applications will no longer work correctly because they were engineered with traffic equality as a defining principle. Hence, providers will have to adapt to meet the guidelines of the new paradigm.

The Importance of Securing Your Cloud

Marty Puranik
One of the biggest misconceptions regarding the cloud is that you can rely on the cloud service provider to protect your business, your data and everything else your firm holds dear.

Take a minute to think about your own home security system. Do you just lock the doors with the key and head off to work, fully confident that your valuables will still be there when you get back? Not likely. Many of us have at least a simple alarm system in place on doors and windows. More and more people are heading toward the latest trends in home security: motion sensors, 24-hour video cameras, remote door answering, etc.

Why does securing your cloud matter? Three enormous reasons:

  • Your cloud provider is only managing part of your security.
  • Cloud security lowers the risk of data breaches.
  • The minimum level of security compliance should never be enough.

Your security vs. cloud security
Let’s talk about your security versus the cloud service provider’s security. The provider has specific language in any contract it signs with you concerning what it is and isn’t responsible for if there is a security breach. In its 2016 “Cloud Adoption & Risk Report,” SkyHigh Networks reported that the average user in an organization employed 36 different cloud services at work. That’s 36 potential security breach points into your cloud and 36 ways for information to leak out. When you introduce all of the apps you need to run your business into your cloud environment, you take on the responsibility of ensuring that they serve only their necessary function when analyzing and manipulating the data stored in your cloud.

It is essential that you manage all of your cloud-based applications and treat them all as security risks until the day you can scratch them off that list. The old days of hiring a third-party app to plug-and-play into your network are long gone. Your best way forward should be with a Security-as-a-Service (SECaaS) solution. Just like your infrastructure, software and your share of the cloud itself, SECaaS is the scalable solution that can handle your growth but also downgrade in the event your business shrinks. Even an in-person, onsite IT expert is not available 24 hours a day, 7 days a week, but a SECaaS is. The service can deploy solutions instantaneously when problems or suspicious activities arise, unlike in a traditional setting where everyone is waiting around for the IT professional to respond to a call for help.

The high price of data breaches
As for breaches, a 2016 study showed that the estimated cost of a data breach for a company is US $4 million. If your company has an extra $4 million lying around, by all means don’t fret about your cloud security. That figure might seem high at first glance, but there’s far more at work here than merely a loss of data or intellectual property. When you suffer a public data breach, word travels fast. Your best employees will be more receptive to offers from competitors. Your recruitment will suffer, as those entering the workforce and those seeking to switch employers will take a much harder look at what sort of company gets breached and whether that is the kind of company they want to work for. And last but not least is the impact a data breach will have on your company’s public perception. The public has an incredibly long memory when it comes to embarrassing incidents at public companies. Don’t believe it? Fast-food giant Jack in the Box had a scare with mislabeled meat in 1981, and 37 years later, it’s still one of the top Google results for the restaurant chain.

Nobody wants the minimum
You didn’t get into business to do the bare minimum when it comes to protecting your assets and your customers’ information. No salesman has ever told a customer that he’d do the absolute least amount of work he could to get the customer’s business. The same excellence you strive for in taking command of your market and maximizing your profits should be applied to keeping your cloud secure.

To ensure the security of your cloud, consider adding dimensions such as multifactor authentication, where even if an employee’s login name and password are stolen or compromised, the party that took them still cannot access your cloud without an additional layer of security. Simple steps like this can be the difference between a secure cloud system and one just waiting to be picked apart by hackers.
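As an illustration of the kind of additional factor described above, the sketch below uses the open source pyotp library to verify a time-based one-time password; it is a toy example under assumed names, not a production login flow.

```python
# Sketch: time-based one-time password (TOTP) as a second login factor,
# using the open source pyotp library. Toy example only.
import pyotp

# Generated once per user at enrollment, stored server-side, and loaded
# into the user's authenticator app (for example, via a QR code).
shared_secret = pyotp.random_base32()
totp = pyotp.TOTP(shared_secret)

def second_factor_ok(submitted_code: str) -> bool:
    # Even with a stolen username and password, an attacker without the
    # current 30-second code is refused.
    return totp.verify(submitted_code, valid_window=1)

print(second_factor_ok(totp.now()))  # True for the current code
print(second_factor_ok("000000"))    # Almost certainly False
```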

What Role Will IoT Play in Edge Computing?

Adam Kohnke
While no one doubts the power cloud computing brings to our present and future digital needs, it still has basic flaws that are cause for alarm, notably concerns over data privacy and its ability to handle large-scale, constant computation.

The Internet of Things (IoT) continues growing at an exponential rate. Its market is estimated to reach $457 billion by 2020, a jump of 28.5% from 2016. But concerns still loom about its shortcomings when synced up to the cloud, a problem that has tied edge computing to the IoT.

If your business depends on IoT devices being able to parse data seamlessly in real-time to provide instant analysis for your processes and people, then edge computing is not some hazy future vision, but your solution for today’s problems.

Edge computing does what the cloud can’t for IoT: it reduces latency and enables faster processing for IoT devices that are attempting to operate in real time. Things like prototype self-driving cars or hospital-room sensors tasked with making decisions as vital signs ebb and flow risk catastrophic events if they cannot process data without delay.

Here are the three biggest reasons your company should be employing edge computing for IoT right now.

Reduced latency. Sometime this year, IoT devices will surpass cellphones in number of connected devices. That’s pretty staggering for a technology most people had no idea existed five years ago. Locating the edge computing device far closer to the IoT object can reduce latency dramatically. Edge computing will also determine each device’s processing needs and adjust accordingly; the entirety of a company’s cloud space does not have to come online for the computations of one small IoT unit.

Upgraded network connections. Edge computing guarantees that cloud outages won’t affect individual devices by limiting interactions with the cloud. Only essential functions will be run through the cloud, which will in turn ease the burden on that environment to perform its own functions. Apps that exist solely in the cloud will have more processing power without having to compete for bandwidth against IoT machines that can successfully exist on the network’s edge.

More cohesive privacy. With new laws like the EU’s General Data Protection Regulation (GDPR) coming online in the near future, data privacy has never been a bigger topic in the digital realm. In its earlier forms, security was a low-grade concern for IoT devices like digital thermostats. But as a legion of microphones, cameras and other personal input devices join the IoT, the threat of data loss or theft becomes more real and more harmful. Edge computing can take considerable strain off securing data by performing a number of the required computational steps on the device itself. The device can then send the data along to the cloud after it has been processed and encrypted, which both speeds its handling in the cloud and lowers the risk of theft.
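A minimal sketch of that idea in Python: the device summarizes a window of raw readings locally, so only a compact payload ever needs to travel to the cloud. The device name and payload fields are illustrative assumptions, and the encryption and upload steps are only noted in a comment.

```python
# Sketch: summarize raw sensor readings on the edge device so only a small,
# pre-processed payload needs to travel to the cloud.
import json
import statistics
import time

def summarize(readings: list[float], device_id: str) -> str:
    """Collapse a window of raw samples into one compact payload."""
    payload = {
        "device": device_id,
        "timestamp": int(time.time()),
        "count": len(readings),
        "mean": statistics.fmean(readings),
        "min": min(readings),
        "max": max(readings),
    }
    return json.dumps(payload)

# One window of once-per-second heart-rate samples becomes a single record;
# in a real deployment this string would be encrypted before being uploaded.
samples = [72.0, 73.5, 71.8, 74.2, 72.9]
print(summarize(samples, device_id="ward-3-monitor-07"))
```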

Author’s note: Established in 1994, Atlantic.Net is a hosting solutions provider, with data centers in New York City, San Francisco, Dallas, Toronto, London and Orlando. Marty Puranik is the founder, president and CEO.

Conducting Cloud ROI Analysis May No Longer Be Necessary

Chris Richter
ISACA’s newly released report, How Enterprises Are Calculating Cloud ROI, is a landmark piece of research that, in my opinion, validates the notion that we have reached (or are at least rapidly approaching) that tipping-point where organizations realize that moving their IT infrastructures to the cloud is an inevitable, foregone conclusion. The white paper documents the growing trend for organizations to forgo financial ROI analyses as a way to justify investment in cloud computing, instead resorting to intangible returns, such as better application performance, enhanced business agility and improved customer- and employee-experience (Cx/Ex).

But could there also be a more subjective reason fewer companies are performing ROI analyses? There definitely is a growing perception that forward-thinking companies are moving away from internally managed data centers and infrastructure. In many of my interactions with IT professionals from major, Fortune 500 enterprises, I have heard comments like, “To stay competitive, we’re moving all applications to the cloud; anything we can’t move will die in place … ” and, “Our board of directors is asking why we are not taking more advantage of the cloud.”

The age of digital transformation is demanding our attention and driving us to a foregone conclusion: we cannot continue to do it all ourselves. As IT professionals, we are asked to reduce costs, increase productivity, stay a few steps ahead of cyber criminals, support the needs of growing mobile workforces, accommodate hybrid networks and improve Cx/Ex, all while improving network and application performance. Enterprises, and even governments, are also under pressure to appear competitive and on the leading edge by outwardly embracing digital transformation. All of this cannot be accomplished by building on top of traditional IT infrastructure and management models.

The digital age is moving at such a fast pace that there may be a general sense that there isn’t time for formal ROI rigor, or that the concept of due care is taking hold, such that it is now considered irregular and imprudent not to migrate IT functions to the cloud. There was a time when office buildings had their own telephone switchboard operators, and textile mills and factories generated their own electricity. When the automated PBX and centrally distributed AC power emerged, it’s doubtful many organizations did an ROI analysis after it became obvious the old way was the wrong way.

ISACA’s new guidance on cloud ROI reinforces many of these notions. The habit of conducting formal cloud ROI analyses may be coming to an end as traditional IT gives way to the cloud.

5 Helpful Tips for Better IT Change Management

Anna Johannson
As you know, change management is critical to the long-term success of every organization. This is especially true when it comes to IT, where change happens at an astonishing pace. But is your organization where it needs to be?

Guidance for Your Change Management Strategy
There is something equally exhilarating and frightening about change. It is a necessary factor in moving forward and growing as a business, but it’s also unfamiliar and intimidating. Unless you have a strategy in place for managing change – particularly on the IT front – it’s quite likely that you’ll focus more on fear and anxiety than hope and excitement.

That being said, here are some tips to help you approach IT change management from a strategic perspective.

Plan ahead
Change management is all about planning ahead and being proactive. Once an issue already has occurred, or your organization finds itself in the midst of a major shift, it is too late. Doing damage control or trying to put out fires will take valuable energy away from other important tasks. Start early and always anticipate what will happen next.

Choose the right software
You don’t have to take on change management all by yourself. Automating some of the process with the right tools can make all the difference in the world. For example, change management software, such as a help desk solution, can allow you to simplify the process by providing highly customizable solutions and automated workflows to manage change requests and approvals.

It’s also helpful to have some sort of communication tool integrated within your change management software that allows you to reach all key stakeholders whenever and wherever they are. The sooner people are involved in the process, the faster you can get things moving on the right track.

Choosing the right tools becomes even more important the larger the organization is. This is something the California State University (CSU) system has discovered firsthand when it comes to making key changes to its IT system.

Any IT system change that occurs on the main campus has to also go through each of the 23 satellite campuses and the thousands of employees, faculty and students at these locations. So, whereas a small change might not have a major impact at the main campus, it can have drastic effects when compounded over two dozen campuses. In order to simplify the process and make it easier to manage, CSU uses an automated change management system from Cisco that allows upgrades to occur automatically across the entire system. The results at CSU have been far better than if change management were handled manually.

Focus on the outcome
While change management is all about taking the necessary steps to move from Point A to Point B in the most seamless and efficient way possible, the focus always has to be on the outcome. When it’s all said and done, change management exists to ensure your IT department is set up for future success.

Prioritize engagement
One of the biggest mistakes organizations make is assuming that change management is all about deploying the right technologies and setting up the appropriate processes. While these are certainly important components of change management, it’s actually your employees who have to execute.

“If these individuals are unsuccessful in their personal transitions, if they don’t embrace and learn a new way of working, the initiative will fail,” explains Prosci, a leading change management firm. “If employees embrace and adopt changes required by the initiative, it will deliver the expected results.”

You have to learn to prioritize engagement of all key stakeholders; otherwise, you’ll find it challenging to make any progress. Start preparing them early and often.

Be flexible
At the end of the day, change never happens as anticipated. You may have a perfect plan for what you think will happen – and even have complete buy-in from all individuals and departments involved – but something will inevitably go awry. A willingness to adjust will serve you well.

Overlook IT change management at your own peril
The word “change” probably evokes a range of emotions. Your mind may jump to past experiences of change that were negative or unwelcome. Or, perhaps you have had good experiences with change and get excited at the thought of doing something new. But regardless of your personal history with change, you need to prepare your enterprise for the future by developing a specific change management strategy. Embracing technology-driven change management is vital if you want to be successful in the modern business world.

Benefiting from Chaos in the Cloud

One of the biggest technology advancements in recent years is the expansion of the cloud, which frees up space on users’ computers and mobile devices while keeping their documents, videos and pictures conveniently stored in one place and accessible from anywhere.

Similar to a commercial security system, the cloud can be used to ensure the safety of documents and other private information. Companies that use the cloud for both storage and security take advantage of unprecedented scalability, quick deployment and the savings that come with them. There are also risks to using the cloud, including a greater chance of unauthorized access to private information, legal exposure and a lack of control.

With so many people making the switch to the cloud, there are new opportunities for people both in business and in private employment. The cloud can also cause confusion about who is actually in control. A business owner has control over what happens and how that business is run, but when it comes down to it, the vendor is the one holding all the cards. For example, the vendor can change pricing at any moment, and with clients depending on the services provided, companies are forced to pay whatever price is asked to ensure those services will continue.

Vendors are also having difficulty adapting to the changes caused by this rapid expansion. As they scramble to keep up, vendors can often lose control of the situation. A survey by ScienceLogic found that less than one-third of IT professionals actually have the control they need to keep their business moving forward efficiently.

Cloud use is growing faster than the organizations that control it can manage, leading to security exposures and unnecessary financial costs. As concerning as that may be, the cloud also leads to new business techniques and opportunities that enable innovation. Businesses worried about the future want to know the best ways to help the company succeed. Sometimes this leads to moments of uncertainty and confusion. These moments can benefit a company by helping it and its employees learn to succeed in different situations and environments.

Believe it or not, a degree of chaos can be effective. Companies that risk confusion and a lack of control often jump ahead in their industry. Businesses such as Gartner, Amazon Web Services and Microsoft Azure have used the cloud as a service to their customers. Each estimated and received an increase in revenue just by switching to the cloud.

In the business world, it is important to stay up to date on technology, but it is even more important to be aware of management tactics. In this case, the cloud is both an advancement in technology and a useful management tactic. In order for a company to truly succeed, it needs a culture that thrives on new ideas and new technology. Organizations that stick to old, outdated ways often become overwhelmed when trying to gain control in a fast-adapting technological environment.

Technology clears the path for employees and companies to become part of an innovative business landscape. There are always risks when it comes to new technology, but taking the chance to learn the new developments can help a business take the lead in their field. The cloud provides a company with the chance to use the extra space as an opportunity to not only help the business succeed but also to help its employees discover new learning and business techniques.

Trusting the Cloud: HIPAA Risk Assessment for Cloud-Based Files

Cloud-based computing and storage is increasingly popular—to the extent that some companies are cutting hard drive space to encourage users to shift toward the cloud. And while the cloud is convenient, allowing your files to travel easily and across devices, that kind of convenience isn’t exactly what you want when it comes to protecting medical files. Is your cloud use secure enough to meet Health Insurance Portability and Accountability Act (HIPAA) standards? Here are some factors to consider.

A Quick Overview
There are a lot of cloud systems available these days, but the first thing you should do when choosing one is compare baseline HIPAA compatibility. Amazon S3, Dropbox and iCloud are not compatible with HIPAA practices out of the box. Most other major systems, including Box, Egnyte, Google Apps and CrashPlan Pro, are HIPAA compliant. Identifying the non-compliant options narrows your choice of cloud systems, allowing you to focus on the details of compliant plans.

EHR or HIPAA
In addition to cloud computing, many physicians are shifting to digital recordkeeping, using what are known as electronic health records (EHR) systems. These systems are great for centralizing patient data and encouraging collaboration across different medical practices that share the same EHR vendor. However, EHR requirements and HIPAA privacy standards aren’t exactly the same.

The first rule of managing EHR in accordance with HIPAA standards is that you should never trust an EHR vendor that says you don’t need to worry about their HIPAA compliance. Although your specific files may be HIPAA compliant, other practices used by external vendors may not be; for instance, their cloud storage security may be lacking. Additionally, although EHR systems have all the features needed to be fully HIPAA compliant, you’ll need to check to make sure they are properly configured. If necessary safeguards are turned off, your patients’ data may be at risk.

Don’t Play Hide and Seek
Rather than establishing thorough HIPAA-compliant practices, some organizations still think that what is known as “security through obscurity” is a valid system that provides the necessary protections. Realistically, though, this is possibly the worst of all security practices. This kind of security focuses on hiding your computer network but tends to disregard proper antivirus software.

Additionally, such practices tend to reveal other lacking security practices within the organization, such as indiscriminate file sharing (between virus-infected computers, no less). Simply hiding your network doesn’t count as securing your files – a skilled hacker can easily access even an invisible network.

BAAs Are Not Enough
Google has a great reputation in the cloud-computing world, including with health organizations that have high security standards. This means that medical practices using Google apps often feel confident that their files are safe, as long as they’ve signed a Business Associate Agreement (BAA).

A BAA might keep your information safe on an internal level, but it won’t help secure patient files when they are transferred to other digital environments. Instead, when transferring files, using end-to-end encryption is the safest bet. This approach will keep your data HIPAA compliant, even when it leaves the Google cloud.
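As a rough sketch of what encrypting a file before it ever reaches a cloud folder can look like, the example below uses the widely available cryptography library’s Fernet interface. The file name is illustrative, and key management (who holds the key, where it lives) is the part HIPAA compliance really hinges on and is not shown here.

```python
# Sketch: encrypt a patient file locally before it is handed to any cloud
# sync or sharing tool, and decrypt it only on the receiving end.
from cryptography.fernet import Fernet

# In practice the key is generated once, kept out of the cloud folder,
# and shared with the recipient over a separate, secure channel.
key = Fernet.generate_key()
fernet = Fernet(key)

with open("patient_record.pdf", "rb") as f:        # illustrative file name
    ciphertext = fernet.encrypt(f.read())

with open("patient_record.pdf.enc", "wb") as f:
    f.write(ciphertext)  # only this encrypted copy goes to the cloud

# Recipient side: same key, reverse operation.
plaintext = fernet.decrypt(ciphertext)
```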

Consider Adoption Side Effects
It’s great to choose a new HIPAA-compliant cloud system for your business, but in our pursuit of better data management systems, we often forget to consider the human elements of adopting new systems. Before choosing a new system, then, it’s important to ask whether your employees will be able to effectively use the new system, and whether there are other options they may find more convenient.

This is a common problem for companies choosing between Office 365 and Google apps for their cloud computing activity. Both Microsoft and Google will sign BAAs that offer HIPAA compliance, but the two programs have different strengths. This is where considering use and convenience is important. If you work a lot with documents, you might think that Office 365 is the way to go—most of us came of age writing everything in Word, so why not? The main reason not to, it seems, is that Google Docs’ collaboration systems are helpful and the platform is more convenient. The reverse seems to hold for spreadsheets.

If you can’t get your team on board with a new computing system, no amount of security regulation in the world will help you. Be sure to clearly tell your staff about the organizations with which you have BAAs, the legal risks of using other systems, and their responsibility for patient privacy as health field employees.
