The Importance of Hospital Data Security

Healthcare providers are no strangers to data security. When it comes to HIPAA, for example, they’ve long known how important it is to keep patient information safe and secure from prying eyes.

With so much hospital information now stored in digital form, primarily in databases, the need for strong hospital data security has never been greater. But what are the risks of database intrusions? And how can hospitals make sure their databases are as secure as possible?

Database Security Risks for Hospitals

Practically all the information hospitals collect, store and use exists in any number of databases. Everything from patient information in electronic health records (EHRs) to patient satisfaction systems, lab processing to employee records, financial results to billing and payment processing – almost any application you find at a hospital runs on a database.

With that massing of data comes a certain amount of risk, much of it unexpected. Hackers have been diligent in their search for structural weaknesses they can exploit in databases. The Office for Civil Rights reports that, in 2015, more than 112 million patient records were compromised. Cybercriminals continue to try to gain access to healthcare records, which they see as a valuable commodity.

One of the more recent trends is the abundance of ransomware, which cybercriminals use to hold data hostage while demanding a ransom that must be paid before the data is released. Some hospitals feel they have no choice but to pay the ransom – such as the nearly $17,000 one Los Angeles hospital paid in 2016 to regain access to its computer systems – or else suffer the dire consequences of starting over from scratch.

As more and more hospitals look to the cloud as a way to share data and streamline internal processes among their employees, cybercriminals are hard at work searching for ways to break cloud security and steal patient records and other data. Utilizing mobile devices like smartphones, tablets and laptops can also create security issues.

How Can Hospitals Keep Databases Secure?

There are a number of best practices hospitals can follow to strengthen security around their database information. Here are some of the most common:

  1. End-to-End Encryption. Data is most vulnerable not when it’s created or stored but when it’s transmitted between devices. It’s important to make sure sensitive data is encrypted when it’s sitting idly in storage or being used in an application. It’s even more important that it be encrypted when being transferred, such as between internal systems and the cloud or between a network server and a mobile device (see the connection sketch after this list).
  2. User Security. It goes without saying that strong passwords are essential for protecting data like patient information. Passwords should include special characters and be changed frequently. Even better is a combination of a strong password and a PIN code – such as one generated by a security token – to gain access to an application housing sensitive data.
  3. System Backups. Backing up data is a necessary step in any security best practice. If data is lost or stolen, it can be restored. But backups should be made on a different network from the live data; otherwise, cybercriminals can hijack both the live data and the backup data. The backups should also be encrypted to prevent unauthorized access.
  4. Cloud Security. The convenience of the cloud is indisputable, as mobile devices make it easier for doctors and other practitioners to access patient information wherever they are. But cloud systems also create a riskier environment, where hackers can more easily intercept and steal data. It’s vital for hospitals to employ a cloud provider that understands and actively monitors cloud security.
  5. Securing Input and Output Files. Cybercriminals can target not just your databases and backups, but the files that ordinarily flow into and out of your databases. If these input and output files – such as reports and work files – contain sensitive information, they should be classified and secured to prevent cybercriminals from obtaining their contents.
  6. DBA Service Providers. When considering healthcare security solutions, hospitals might consider contracting with a database administrator (DBA). A quality third-party vendor can provide hospital IT solutions that both optimize database efficiency and protect the hospital from cyberattacks.
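
As a minimal illustration of the in-transit half of item 1, the hedged sketch below opens a database connection that requires TLS encryption on the wire. It assumes the pyodbc package and ODBC Driver 17; the server, database and credentials are invented placeholders, and any hospital application database could be substituted.

```python
import pyodbc

# Hypothetical connection details - replace with your own values.
conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=ehr-db.example-hospital.org;"
    "DATABASE=PatientRecords;"
    "UID=app_service_account;"
    "PWD=use-a-vaulted-secret-here;"
    "Encrypt=yes;"                 # force TLS for data in transit
    "TrustServerCertificate=no;"   # reject servers without a trusted certificate
)

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
cursor.execute("SELECT COUNT(*) FROM dbo.Patients")  # hypothetical table
print(cursor.fetchone()[0])
conn.close()
```

Client-side settings like these only cover the transport; encryption of the data at rest is configured separately on the database and storage layers.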

Hospital data security should be viewed as part of an overall strategy for managing database information for the best benefit of the hospital. A knowledgeable and sophisticated DBA expert can help hospitals protect their data. A DBA expert can also help hospitals use this data for business intelligence (BI) purposes beyond the scope of each individual database application.

HIMSS, the Healthcare Information and Management Systems Society, is continually helping healthcare providers evolve and stay secure through the use of information technology. RDX will be exhibiting at HIMSS18, taking place March 5-9, 2018, at the Sands Expo Center in Las Vegas. We hope to see you there.

 

DBaaS or IaaS? Database Cloud Comparison

Introduction

Technology leaders are being inundated with new cloud architectures, strategies and products – all guaranteed by vendors and various industry pundits to solve all of our database challenges. This seemingly endless array of public cloud-based DBMS offerings can quickly become bewildering. One of the top questions our customers ask is whether they should choose DBaaS or IaaS as their preferred database cloud architecture. This post is intended to peel back the veil on the two primary cloud-based DBMS platforms by sharing our experiences with IaaS and DBaaS architectures.

One of the benefits of working for a remote DBA services provider is that our shop’s collective knowledge is not constrained by any one organization’s technology implementation. We have customers whose technology strategies range from “bleeding edge” to “yesterday’s technology tomorrow.”

We know what products work and which ones don’t, what tech stack combinations play well together and what database technologies and features provide the most benefits for a given business or technical need.  In addition, we are required to administer virtually every database feature you can think of for every product we support, and we work with dozens of cloud systems.  This provides us with a wealth of knowledge that  includes cloud strategies, technologies, architectures, product offerings and vendor-specific features.

DBaaS and IaaS Defined

Let’s continue our discussion by learning more about the two primary cloud architectures – Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS). Since we are talking about database management systems, let’s use the term Database-as-a-Service (DBaaS) for the database flavor of PaaS.

We all have experience with on-premises systems.  We have to build the server rooms,  provide heating, cooling, redundant connectivity and power.  We are required to purchase, install and administer all of the components to provide the safest environment we can for our systems.

We are then required to buy the server hardware, install it and maintain it. When it breaks or we want to increase horsepower, we have to open the chassis and work on the server components. We add CPU, memory, disk – whatever we need. To perform those activities, we either have to take an outage or make plans to shift the system and workloads to another server to ensure availability. We also buy and administer the OS and DB software we need to run our database-driven applications.

In addition, we evaluate, buy, install and support all of the other products we need, which often includes monitoring, security, auditing and third-party reporting products. With on-premises systems, we have to buy and support everything – both hardware and software.

Let’s move on to the cloud. Most IaaS and DBaaS environments are multi-tenant, which means we are sharing the vendor’s compute and storage architecture with other customers.  In addition, depending on the architecture and vendor chosen, the system will vary in degrees of scalability, elasticity, automated administrative services and self-service.

The architecture that is closest to on-premises is Infrastructure-as-a-Service (IaaS). With IaaS, the vendor provides the compute and storage infrastructure and may offer some level of system maintenance activities. Customers have direct access to the cloud system, which includes both compute and storage components. Think of it as a server – more often than not, a virtual server – in the cloud.

You don’t have to  build your server support environment that provides air conditioning, light, multiple power providers, UPS systems, generators and redundant connections to the internet.  All of those features are provided by the vendor, but IaaS customers will continue to maintain full ownership of their software stack’s administration, including the operating system and database.   Customers install and administer their software of choice on the Infrastructure-as-a-Service platform.

It’s important to note that depending on the IaaS provider and the offering chosen, customers are able to take advantage of the vendor’s features to reduce the time required to support the environment. Microsoft Azure, for example, provides prebuilt images you can use to get a jump-start on provisioning a new DB environment. However, you will need to tailor that generic build to meet your needs.

Now that we know that IaaS is pretty much a server in the cloud, let’s move on to PaaS, or in our case, DBaaS – Database-as-a-Service. DBaaS vendors provide all of the server environmental benefits that their IaaS counterparts do.

DBaaS providers increase their level of control and responsibility by assuming ownership of the operating system and database software as well as the hardware. DBaaS customers perform little to no operating system and database software administration. The vendors constantly enhance their architecture’s automation capabilities to further reduce the human labor involved in administering their environments.
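
To make that division of labor concrete, here is a hedged sketch that provisions a managed database through Amazon RDS (one DBaaS example) with the boto3 SDK. The instance name, class and credentials are invented; the point is that the engine installation, patching, backups and high availability are the vendor’s problem rather than the DBA’s.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# With IaaS we would launch a bare server and install/patch the DBMS ourselves.
# With DBaaS, one API call asks the vendor for a fully managed database instance.
rds.create_db_instance(
    DBInstanceIdentifier="demo-dbaas-instance",   # hypothetical name
    Engine="sqlserver-se",                        # vendor installs and patches the engine
    DBInstanceClass="db.m5.large",
    MasterUsername="admin",
    MasterUserPassword="use-a-vaulted-secret",
    AllocatedStorage=100,                         # GiB
    MultiAZ=True,                                 # vendor-managed high availability
    BackupRetentionPeriod=7,                      # vendor-managed automated backups
)
```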

The DBaaS vendors also make modifications to their database software for two reasons:

1. To ensure their product will work in their shared cloud environment
2. To leverage the benefits that the cloud, and their architecture, inherently provide

Geographic data redundancy is one example, as it allows customers to leverage the cloud to more easily create DR and HA systems. It’s important to note that as we discuss IaaS and DBaaS architectures, there can be a lot of variation in the vendor offerings.

Comparing On-Premises, IaaS and DBaaS Architectures

Each environment – on-premises, IaaS and DBaaS – has strengths and weaknesses that are inherent to its architecture. Below is a comparison of the three DB environments shops can choose from: on-premises, a database running on IaaS and a DBaaS offering. Each has its pros and cons, benefits and drawbacks.

IaaS allows customers to maintain tighter administrative control over their environment. They can also more easily leverage their favorite third-party products on IaaS systems than they can with DBaaS. IaaS is, at its core, just a server provided to you over the cloud.

Many third-party tools – for monitoring, security, application development and auditing – can be challenging to integrate into a DBaaS architecture because of the modifications the vendors make to their systems. All of the DBaaS vendors provide monitoring tools. Some vendors, like Amazon, charge extra if the customer wants a more robust monitoring solution than what is offered in their base package. In general, DBaaS monitoring tools aren’t as robust as their on-premises counterparts, but they are catching up very quickly.
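
As an illustration of the baseline monitoring a DBaaS vendor exposes, the hedged sketch below pulls CPU utilization for an Amazon RDS instance from CloudWatch using boto3. The instance identifier is hypothetical, and deeper “enhanced monitoring” is a separately priced option.

```python
from datetime import datetime, timedelta
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Average CPU for a hypothetical RDS instance over the last hour.
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/RDS",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "demo-dbaas-instance"}],
    StartTime=datetime.utcnow() - timedelta(hours=1),
    EndTime=datetime.utcnow(),
    Period=300,                 # 5-minute buckets
    Statistics=["Average"],
)

for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], round(point["Average"], 1), "%")
```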

If the DBaaS provider determines that its own underlying software, OS or database needs a critical availability, security or performance patch, shops may not have a choice about its implementation. If that patch requires an outage, the customer will need to schedule it, oftentimes before a certain date. On the other hand, DBaaS offerings allow customers to more easily configure complex architectures such as high availability and disaster recovery. All of the major DBaaS providers offer geo data redundancy.

Leveraging DBaaS Environments to Reduce DBA Labor Costs

Migrating to a DBaaS environment does reduce the amount of time a DBA spends administering the database environment, but it doesn’t reduce that administrative time to zero. Customers will experience the most significant time savings in OS administration and hardware support.

DBAs do spend time installing, patching and upgrading the DBMS software as well as tuning the environment and setting up and monitoring maintenance and backup utilities. Many of these administrative tasks can be provided by the DBaaS vendor depending on the vendor and offering selected.

The majority of a DBA’s  time is spent working within the database systems themselves. DBAs  build schemas, grant security, assist developers with SQL and procedural program tuning, provide advice, enforce business logic using database features, tune application workloads and debug issues. DBaaS vendors don’t provide these services as part of their basic package. There are literally dozens (and dozens) of administrative activities that DBAs are required to perform in DBaaS environments.

It’s important to remember that although the vendor may provide the mechanisms and processes to automate administrative activities, this does not reduce the DBA’s support responsibilities for them to zero either. Personnel may need to configure how those activities are performed and when they are scheduled. All of the major DBaaS vendors provide patching, maintenance utilities and automated backup processes, but it is up to the DBA to review scheduling, configure custom schedules and monitor the execution of all automated tasks.
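
For example, the automated backups an Amazon RDS-style DBaaS provides still have to be reviewed and scheduled by someone. A hedged sketch with boto3 (the instance name and windows are illustrative, not a recommendation):

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# The vendor runs the backups and patches, but the DBA still decides retention and timing.
rds.modify_db_instance(
    DBInstanceIdentifier="demo-dbaas-instance",        # hypothetical
    BackupRetentionPeriod=14,                          # keep two weeks of automated backups
    PreferredBackupWindow="03:00-04:00",               # low-traffic window, UTC
    PreferredMaintenanceWindow="sun:04:30-sun:05:30",  # vendor patching window
    ApplyImmediately=False,                            # defer to the next maintenance window
)
```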

An increasingly competitive DBaaS marketplace forces all vendors to maximize their product’s inherent feature set. Constant innovation and integration of new features that differentiate their product from other vendors is an absolute requirement for their continued competitive survival.

During my tenure in the IT field, I’ve found the following equation to be true:

New Features
+ New Functionality
+ New Technologies
+ New Architectures
+ New Business Requirements
= Increased IT Support Complexity

As the cloud vendors add features to their DBaaS offerings, the products will increase in complexity. The features may automate some of the administrators’ support activities but, at a global level, DBAs with a high level of technical knowledge will continue to be needed to support DBaaS environments.

Here’s a quick example. Most cloud vendors provide features that allow administrators to more easily configure HA and DR environments, but configuring, implementing, administering and monitoring them still requires extensive knowledge of these technologies and of the underlying business requirements. The activities may be performed much faster in DBaaS environments than on on-premises systems, but that foundational base of knowledge is still required.
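
A hedged sketch of what “easier” looks like in practice: with boto3 and Amazon RDS, a cross-region disaster recovery copy is a single API call, but someone still has to choose the region, size the replica and monitor it. The names and ARN below are hypothetical, and replica support varies by engine and vendor.

```python
import boto3

# Create the DR copy in a different region from the primary.
rds_west = boto3.client("rds", region_name="us-west-2")

rds_west.create_db_instance_read_replica(
    DBInstanceIdentifier="demo-dbaas-instance-dr",
    # Cross-region replicas reference the source by its ARN (hypothetical here).
    SourceDBInstanceIdentifier=(
        "arn:aws:rds:us-east-1:123456789012:db:demo-dbaas-instance"
    ),
    DBInstanceClass="db.m5.large",
)
```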

Well-trained administrators will determine what regions are best to house the failover systems, work with developers and business users to select a failover strategy that meets their combined business, technical and budgetary needs, monitor the actual failover process, identify the root cause of the failover to ensure it doesn’t recur and restore the system to its original configuration once the problem is corrected.

Impact on Existing Change Management Procedures and Documentation

Because cloud environments are different from on-premises systems, organizations leveraging these new architectures may have to update their existing change management processes and documentation. How long that takes, and the number of techs needed for the project, depends – once again – on the cloud architecture chosen and on how stringent the organization’s change management processes and documentation requirements are. Because DBaaS is administered very differently from on-premises systems, it will have a greater impact than IaaS, which is just a server in the cloud.

Additional documentation that may also need to be modified includes security, disaster recovery, monitoring, problem resolution, job scheduling, administrative best practices, repeatable processes, and internal, industry-specific and governmental regulatory compliance.

The amount of documentation change required will depend upon the breadth and depth of documentation your particular organization requires as a best practice. Once again, the impact is greater for DBaaS.

Impact on Existing Toolsets

Shops migrating to DBaaS environments will need to identify all of the build, administration, monitoring and access tools that they use to interact with their on-premises databases. Most shops have a couple of “must-have” tools that are frequently used. Administrators will need to identify which of their organization’s existing toolsets will continue to work in the cloud – and which ones won’t. The popularity of the cloud is driving most software vendors to make sure their offerings work with cloud systems, but it is not something that should be taken for granted.

RDX’s recommendation is to create a list and verify that the preferred tools will continue to work with the cloud versions of the database. The majority of the tools will work with IaaS (remember, it is just a server in the cloud), but for DBaaS, shops will need to evaluate whether each tool can integrate and the level of effort needed to integrate it.
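
One low-effort way to start that verification is a connectivity smoke test: point each tool’s driver at the cloud target and confirm it can at least connect and run a trivial query. A hedged sketch using pyodbc with made-up connection strings:

```python
import pyodbc

# Hypothetical connection strings for the cloud targets you plan to evaluate.
targets = {
    "Azure SQL DB (DBaaS)": (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=demo-server.database.windows.net;DATABASE=demo;"
        "UID=eval_user;PWD=use-a-vaulted-secret;Encrypt=yes;"
    ),
    "SQL Server on IaaS VM": (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=10.0.0.25;DATABASE=demo;"
        "UID=eval_user;PWD=use-a-vaulted-secret;Encrypt=yes;"
    ),
}

for name, conn_str in targets.items():
    try:
        with pyodbc.connect(conn_str, timeout=10) as conn:
            conn.cursor().execute("SELECT 1")   # trivial round trip
        print(f"{name}: connection OK")
    except pyodbc.Error as exc:
        print(f"{name}: FAILED - {exc}")
```

A passing smoke test doesn’t prove full feature compatibility, but a failing one surfaces driver and authentication gaps before any deeper evaluation begins.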

Comparing On-Premises DB Features to IaaS and DBaaS

We learned that IaaS allows us to install on-premises DB software in the cloud.   There’s not much analysis required to verify that all of the features we leverage in our on-premises databases are available in the IaaS cloud. Before shops migrate their favorite on-premises databases to DBaaS environments, they will need to identify the list of DB features that are not available in the cloud DBaaS offerings.

Let’s compare on-premises SQL Server to the two leading cloud vendors, Amazon and Microsoft, both of which offer SQL Server DBaaS. Amazon’s SQL Server DBaaS offering doesn’t support PolyBase, Stretch Database, backing up to blob storage or importing data into the msdb database. Also, administrators can’t rename a database if it is used in an Amazon Mirroring deployment, and the offering doesn’t allow customers to increase storage on a SQL Server database. If the customer needs to increase the storage of a SQL Server DB instance, they are required to back the database up, create a new DB instance with increased storage and then restore the databases into the new DB instance.

It doesn’t support SSIS, SSAS or SSRS, and it doesn’t have SQL Server server-level security roles for sysadmin, serveradmin, securityadmin, dbcreator and bulkadmin.  It also doesn’t support database mail, maintenance plans, distributed queries, log shipping, change data capture, SQL Server Audit or bulk insert.

How about Azure SQL DB vs SQL Server on-premises? These are two DB products offered by the same vendor. Microsoft’s Azure DBaaS offering doesn’t support attaching a database, backup and restore statements, change data capture, database mail, database mirroring, database snapshots, extended stored procedures, filestream, linked servers, log shipping, resource governor, the profiler, SSIS, SSAS or SSRS.

Now, instead of database mirroring and log shipping, it does provide Active Geo-Replication, which could be a better alternative for many customers – but it’s different! I’m not stating that these environments aren’t as effective as on-premises, but they have different features that we all need to be aware of.

Wrap-Up

Recent RDX customer surveys have shown that although most of our clients have defined their high-level cloud migration strategies, they are continuing to evaluate and compare IaaS and DBaaS platforms. Some of our smaller and mid-sized customers have decided to go “all in” and migrate all of their systems to the cloud, but the majority of our customers have stated that they will execute a “best fit” strategy, implementing new databases on on-premises platforms as well as on IaaS and DBaaS architectures.

In addition to choosing the most appropriate cloud architecture for a given database-driven application, there are a host of other considerations that must be evaluated when migrating to the cloud. Will you need to transfer data into and out of that cloud DB environment?  How much data? Does the database being migrated depend on data from other on-premises systems? How will you ensure that the database and data transfers are secure? What level of application changes are you comfortable with?  Are you permitted to have a downtime for the migration or is it required to be a “flip the switch” process? This is just a quick sampling.  An entire article could be written on all of the issues that must be considered and evaluated when migrating to cloud architectures.

RDX has successfully converted dozens of on-premises systems to the cloud and changed DB products along the way.  RDX offers a wide range of cloud DB services – from strategic analysis and architecture design to migration and ongoing support.  If you would like our assistance, please visit our Cloud Services page for more info.

 

 

Healthcare Business Intelligence Use Is Growing. Why?

If you’re a healthcare provider and you’re not using healthcare business intelligence to drive your success, then you may be losing out on some important advantages. The benefits of business intelligence (BI) for healthcare providers are many.

Here’s why healthcare BI is growing and how it can positively impact your financial bottom line, improve the quality of your patient care, ensure healthcare security and aid in compliance with government-mandated regulations.

Not Just Big Data, But Massive Data

Healthcare organizations are producing and storing more data than ever before. From electronic health records to financial performance, from patient satisfaction results to industry health statistics, all this information forms a massive amount of data. Add to that the policy information from insurance companies and clinical trial results from pharmaceutical companies and you get what many experts describe as some of the most complex data warehouses on the planet.

Healthcare providers can use BI to process, present and analyze this enormous amount of data in meaningful ways. It can help them to track progress toward strategic goals, offering big-picture guidance that makes it easier to steer organizations in the right direction.

Skyrocketing Healthcare Costs

With healthcare costs rising at unprecedented levels, it has become increasingly important for providers to manage those costs. BI allows healthcare providers to gain more visibility into their financial operations, improving operational efficiency. It helps organizations monitor cash flow and make the proper adjustments to streamline it.

BI may also help healthcare providers identify services that are highly profitable or underutilized, generating the analysis they need to set pricing, control expenditures, allocate staff time and more easily process claims. BI can help organizations determine the effectiveness of their marketing efforts as well.

More and Better Healthcare

Per capita healthcare spending in the U.S. is among the highest in the world. With that expenditure comes an expectation of better healthcare outcomes and higher patient satisfaction.

BI can help healthcare providers improve patient care and analyze quality and safety trends. It can offer information that allows providers to better manage logistics, such as patient flow in triage and optimal patient discharge times, making the best use of bed space without sacrificing patient outcomes. It can provide the evidence physicians need to make clinical decisions and to better monitor and predict patient diagnoses.

Mushrooming Compliance

Federal health reforms have led healthcare providers to take regulatory compliance seriously, especially as more quality-based requirements are implemented. Complying with HIPAA rules concerning patient privacy and information security is particularly important.

To meet the demands of complex regulations, healthcare providers must have quick access to many different kinds of financial and patient data. BI can help collect, organize, analyze and present data to regulators, making it less likely that providers will be audited, sanctioned or otherwise fined for noncompliance.

Getting Expert Help with BI

A comprehensive BI strategy is essential for healthcare providers. While some organizations may think of BI as simply a pretty way to present data, it’s really much deeper than that. A sophisticated third-party BI administrator can more fully optimize a healthcare provider’s BI strategy and deliver the results it is looking for in a BI solution. They can help healthcare providers drive real competitive advantage from their IT by providing the expertise, services and scale needed to optimize the business value of their mission-critical technologies.

RDX – the #1 provider and pioneer of remote DBA services – offers Microsoft BI services to help healthcare providers leverage Microsoft’s industry-leading BI product suite, turning raw data into revenue. From strategic planning to production and beyond, our BI analysts are with you every step of the way, managing your solutions, empowering your success and maximizing your BI investment.

HIMSS, the Healthcare Information and Management Systems Society, is continually helping healthcare providers evolve through the use of information technology. RDX will be exhibiting at HIMSS18, taking place March 5-9, 2018, at the Sands Expo Center in Las Vegas. We hope to see you there.

 

Google & Cisco are Partnering Up with a Hybrid Cloud Solution

Question: What do you get when you pair the largest networking company in the world with one of the premier cloud storage providers? Answer: Cisco-Google hybrid cloud services.

The two companies will be working together on a comprehensive solution to develop, run, secure and monitor customer applications and data. The Cisco-Google partnership will offer hybrid cloud services that allow customers to plan their cloud migration in accordance with their own strategies and within their own timeframes. It’ll prevent companies from being locked into expensive, outmoded or unmanageable systems. It’ll also help to maximize any investment companies make in cloud technology.

Bringing the Power of the Cloud In-house

Exactly what this solution will look like has yet to be detailed, but the goal of it is to bring the power of the cloud in-house. Once rolled out, developers and other IT professionals will be able to take advantage of Google’s secure cloud-storage tools, using them seamlessly to create applications within their own internal systems environment.

Both Cisco and Google have said that it’s important for customers to be able to take their networking and security capabilities with them when utilizing cloud resources. They say that any multi-cloud solution must also include support for customers’ policy requirements, as well as the ability to get real-time networking and performance data.

Managing Applications Like Never Before

Many companies rely on a combination of public and private cloud services, but there are significant differences between the two. Because of those differences, it’s difficult for developers to write applications that can move easily between environments. Developers must learn to operate in each environment separately. They’re not able, for example, to develop an application in the public cloud and deploy it in the private cloud.

Some applications belong on-premises, and some belong in the public cloud. But what if those applications could all work together? What if applications were able to extend across environments, where they can take advantage of applications and services housed in other data centers and clouds?

The Cisco-Google partnership will offer open-architecture hybrid cloud solutions that allow customers to better develop and manage applications either in-house or in the cloud. It will help developers to make use of open source platforms, such as Kubernetes and Istio, GCP Service Catalog and service mesh monitoring.

Cloud Expertise Gets Companies in the Game, Keeps Them There

The roll-out of the new hybrid cloud solution is expected to occur in early 2018 with a limited number of customers, followed by a full roll-out later in the year.

In the meantime, companies considering Google-Cisco hybrid cloud services would do well to consider taking advantage of the hybrid cloud expertise third-party database managers have to offer. Whether companies are considering the cloud or are already there, a knowledgeable DBA or cloud expert can help them get the most from their cloud investment.

RDX – the #1 onshore provider of remote DBA services – has the people, processes and technology required to speed and simplify cloud deployment, optimize assets and help you achieve business benefits faster. From needs analysis to production deployment, our cloud solutions help you successfully navigate every phase of your cloud journey. Once you’re in the cloud, we can help solidify your presence there with reliable, secure monitoring and administration services. Contact us today for more information on how you can make the cloud work for your business success.

Pros and Cons of Cloud Storage for Businesses

Each year, more businesses turn to cloud solutions for storing business files – documents, spreadsheets, images, etc. As security in the cloud becomes stronger, especially through token and encryption techniques, business leaders have warmed to cloud storage as a way to cut costs, create efficiencies and take advantage of third-party expertise. Forbes estimates that, by 2018, at least half of IT spending will be cloud-based. Before business leaders move their company information offsite, there are a number of cloud storage pros and cons worth considering.

Why Businesses Store Files in the Cloud

Companies realize a number of obvious benefits of cloud file storage:

1. Cloud storage allows users to work wherever they are. Documents are shared seamlessly across office locations, facilitating better collaboration among staff. Employees can access information whether they’re at the office, at home or on the road. All that’s needed is a laptop (or other electronic device) and an internet connection.

2. It may save on company bandwidth. When files are routinely emailed back and forth, it can stress a company’s IT infrastructure. Like a crowded highway, it slows down traffic and creates storage challenges. Storing business files in the cloud, however, places the burden of keeping traffic flowing squarely on a third-party provider. Plus scalability can be accomplished almost instantaneously.

3. Cost savings are another big benefit of cloud storage. Cloud providers distribute storage and service costs across many businesses, passing much of the savings along to their clients. Storing files in the cloud enables most businesses to cut back on hardware and maintenance expenses, including labor costs.

4. When a business suffers a catastrophic event – a fire, natural disaster, data hijacking or total systems crash – cloud storage is a reliable way to manage disaster recovery. Files are stored securely offsite and can be easily and effectively replaced.

5. Most cloud storage platforms have the added benefit of applications and other tools that allow businesses to optimize their information. Many cloud providers make it possible for businesses to stream files (such as audio or video) directly to customers, upload and download content directly from mobile apps and websites, host and serve the static assets of websites, and run data analysis programs. Many have also developed business intelligence (BI) analytic tools that help companies sort, understand and use the big data they accrue throughout the course of doing business.
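
As a small, hedged illustration of that last point, the sketch below stores a file in Amazon S3 with boto3 and then creates a time-limited link that could be handed to a customer or a mobile app. The bucket and file names are made up.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and object names.
bucket = "example-company-media"
key = "videos/product-demo.mp4"

# Push the file to cloud storage.
s3.upload_file("product-demo.mp4", bucket, key)

# Generate a link that expires in one hour, suitable for a website or mobile app.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": bucket, "Key": key},
    ExpiresIn=3600,
)
print(url)
```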

Considerations of Cloud Storage

Security has always been a top consideration when it comes to storing information in the cloud. Businesses with sensitive information may be especially concerned about public cloud security, as firewalls between company information and that of other businesses could, theoretically at least, fail.

Despite the improved security most public cloud providers demonstrate today, some data-sensitive businesses have resigned themselves to internal or private cloud storage of business files. A private cloud may be good for companies that have already established their own data centers, but it does require on-premises IT staff to manage it, not to mention the expense of maintaining hardware.

Other businesses prefer to use a storage service that’s fully administered by someone else but still has an increased level of privacy. These operate a lot like the other public cloud storage services but with one major difference: the data is encrypted and stored in a way that nobody except the business can access it. Even the employees of the service can’t access the files.

Some businesses are now using a hybrid of private and public cloud services. Certain files are stored locally, while other files are deployed to the cloud. A hybrid cloud solution must meet certain key requirements to make it work. For example, it must appear seamless to users. Hybrid clouds also depend on policy engines to define when specific files get moved into the cloud or pulled from it.
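
A policy engine can be as elaborate as a commercial product or as simple as a scheduled script. Purely as a hedged sketch of the idea – the age threshold, bucket name and paths are invented – the snippet below tiers files that haven’t been touched in 90 days from local storage to a cloud bucket:

```python
import time
from pathlib import Path

import boto3

s3 = boto3.client("s3")

LOCAL_DIR = Path("/data/shared-files")       # hypothetical on-premises share
BUCKET = "example-company-archive"           # hypothetical cloud bucket
AGE_LIMIT = 90 * 24 * 3600                   # "cold" after 90 days, per this made-up policy

now = time.time()
for path in LOCAL_DIR.rglob("*"):
    if path.is_file() and now - path.stat().st_atime > AGE_LIMIT:
        key = str(path.relative_to(LOCAL_DIR))
        s3.upload_file(str(path), BUCKET, key)   # tier the cold file to the cloud
        path.unlink()                            # then reclaim the local space
```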

Cloud Storage Options

Cloud storage options are expanding rapidly, as more and more vendors enter the market. For storing flat files, businesses often look to Amazon (AWS), Microsoft (Azure) and Oracle (Cloud File System and Cloud Infrastructure Storage) for various products or platforms that address their individual needs.

Companies looking for help in selecting, integrating and taking full advantage of these platforms might find it beneficial to engage a technology consultant. As the #1 onshore provider and pioneer of remote DBA services, RDX puts technology to work for business worldwide. From needs analysis to production deployment, RDX’s cloud solutions help businesses successfully navigate every phase of the cloud journey. RDX enables customers to fully leverage the inherent benefits of cloud architectures and assists them in overcoming some of the more challenging cloud activities.

Google Cloud Security Best Practices

Many businesses employ powerful cloud platforms such as Google Cloud to store company data – spreadsheets, documents, databases, images, applications, software and much more. But some businesses contemplating Google Cloud Platform wonder: Is Google Cloud safe? Is data stored in Google Cloud secure?

The answers to these questions appear to be “yes,” as long as companies take extra steps to make certain that data won’t be compromised. Below are some Google Cloud security best practices that business leaders should be aware of when working with their teams to ensure the security of company information.

Why Use Google Cloud?

As might be expected, Google has put a lot of resources into making Google Cloud a state-of-the-art storage platform. Unlike some upstart cloud providers, Google Cloud has earned a reputation for quality data storage performance. Accessible anywhere in the world, Google Cloud claims to be “cost-effective and constantly improving.” It delivers all the high-performance infrastructure companies need to store their data.

Google Cloud also includes powerful tools for analyzing big data. Companies generate all kinds of data that can be useful in tracking business transactions, identifying customer or client trends, pinpointing inefficiencies in systems, and making informed decisions on the future of the business. Google Cloud’s customizable business intelligence (BI) analytics can propel businesses to greater success.

Keeping Google Cloud Secure

Along with Google Cloud’s respectable track record comes a commitment to security. Google works strenuously to ensure that data stored in Google Cloud is protected from intrusions. Although a public cloud platform has inherent security risks that a more expensive private cloud would not, Google nevertheless understands that the security of company data must be a major concern of any cloud platform.

While Google is doing its part in keeping data protected, security is a shared responsibility. Here are some Google Cloud security best practices companies should take into account:

Data Classification. Data has different degrees of sensitivity. Classifying data allows companies to categorize any data that should be restricted from wider distribution or otherwise confined to certain user groups. This is especially critical for data containing private information that could identify a specific individual, whether an employee or a customer.

Access Control. Companies are responsible for controlling access to data. It’s important to set user permissions at the project and application levels. This includes preventing end users from sharing critical information outside the corporate network or public cloud infrastructure.

Password Protection. Insisting that users have strong passwords is always a security best practice, especially when working in the cloud. Passwords should be as unpredictable and as random as possible. Two-factor authentication solutions (such as a password and token, password and emailed code, password and fingerprint) make it even harder for attackers to gain control of an account.

Data Encryption. Encrypting data is essential for creating a secure working environment. This is especially important when transferring data into or out of the cloud. Employing strong encryption, at all phases of data management, makes it more unlikely that data will be compromised.
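
Google Cloud Storage already encrypts objects at rest, but data-sensitive companies sometimes add their own layer before the data ever leaves the building. A hedged sketch – the bucket and file names are invented – using the cryptography package and the google-cloud-storage client:

```python
from cryptography.fernet import Fernet
from google.cloud import storage

# In practice this key would live in a key management service, not in code.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt the file locally before it leaves the corporate network.
with open("payroll-export.csv", "rb") as f:          # hypothetical sensitive file
    ciphertext = fernet.encrypt(f.read())

# Upload only the encrypted bytes to a hypothetical bucket.
client = storage.Client()
bucket = client.bucket("example-company-secure-data")
bucket.blob("exports/payroll-export.csv.enc").upload_from_string(ciphertext)
```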

Vulnerability Testing. It’s also important that data environments be routinely checked for vulnerability. Vulnerability assessment and penetration testing (VAPT) look for code flaws and application leaks that might make data insecure. If a vulnerability is found, it should be reported to Google via the Vulnerability Reward Program.

Security Sharing with Consultants. When hiring cloud platform consultants, companies should look for ones that take security and privacy seriously. Any third-party vendor that handles business information should have the highest certification available when it comes to security processes.

At RDX, security is of paramount importance. Although we don’t store or process any data for our customers, we adhere to one of the most comprehensive security and privacy frameworks in the IT industry and have audited every security control possible within our organization. We reduce the risk of business disruption by leveraging RDX’s expertise and controls – which include SSAE16, AICPA SOC 2 and PCI DSS compliance – ensuring the security, availability, integrity, confidentiality and privacy of data and transactions. You don’t become the #1 provider and pioneer of remote DBA services without paying close attention to data security.

ICYMI- RDX Insights Webinar: Microsoft SQL Azure Overview & Demo

This month, RDX’s VP of Delivery Strategies and Technologies, Chris Foot, and RDX Azure Expert, Jim Donahoe, teamed up to present Microsoft SQL Azure Overview and Demo.

During the webinar, Chris and Jim took a deeper look at some of Azure SQL DB’s most popular and interesting features in addition to how the product differs from its on-premises and IaaS counterparts. They also covered a wide range of topics from purchasing and provisioning to geo-replication, sharding and advanced automations.

The presentation concluded with a demo from Jim explaining how to:

  • Deploy a DBaaS instance
  • Configure a DBaaS firewall
  • Configure resource locks
  • Use Query Performance Insight (QPI) to analyze DBaaS workloads
  • Configure failover groups and review their use cases

If you missed yesterday’s webinar, you can view a copy of the slides on SlideShare.

You can also view a live recording of the presentation below:

You can join our mailing list for updates about future RDX Insights Series presentations by emailing info@rdx.com.

What Is a Strong Password?

It goes without saying that the security of any company’s business information is of primary importance. Whether or not sensitive customer information is actually stolen, any breach in company data makes security appear weak, can scare away customers and may eventually lead to a company’s demise.

Most systems and applications dictate what the minimum security standards are; one system might simply require 8 alphanumeric characters, while others may require longer passwords with additional parameters, such as the inclusion of both upper- and lower-case letters and the forced exclusion of publicly available personal data, such as a user’s name.

Oftentimes, users create passwords that conform only to those minimum standards.

There are many security measures a company can take to protect the business against information attacks. One such measure lies within control of every individual user: password security. Regardless of whatever the minimum standards of a system are, individuals should strive to create the most secure passwords they can. Here’s some password advice from experts in the field.

What Most People Think Is a Strong Password (Really Isn’t)

Password security has been a business priority for a long time. Users are instructed to generate passwords that would be difficult to hack. But just how strong are those passwords?

Some experts believe it hardly matters. They argue that hacking software has become so sophisticated that it can decode pretty much any password users create. There have been instances, too, where businesses have required users to maintain complex passwords only to have hackers break in and steal a list of the company’s passwords that was never encrypted on the server.

These skeptical experts advise businesses to instead put more of their energies into locking down systems, strengthening firewalls, encrypting data, employing two-factor authentication and putting clear procedures in place that prevent hackers from getting in and information from getting out. While these are all best practices businesses should definitely follow, other experts continue to believe in the importance of password protection as a central way to safeguard business information.

Why Passwords Fail

What makes a weak password? Passwords fail for any number of reasons, but the most common one is that they’re too predictable. Anytime users include familiar words or phrases or identifiable numbers, the password has a good chance of being hacked. Using a street name and house number, for example, would be like putting the welcome mat out for hackers. Same with using surnames, maiden names, parents’ names, kids’ names, pet names or any number of other recognizable monikers.

It’s not that hackers know who users are and where they live, but the algorithms they employ to break into systems are very good at guessing. Hackers can process password attempts automatically and at lightning speeds. Without strong passwords, companies might as well just give away their information.

What Is a Strong Password?

For better password security, users should take into account all of the following:

1. A strong password should be at least 12-16 characters in length – the longer the better.

2. It should be a combination of upper- and lower-case letters, numbers and special characters.

3. It should include unrecognizable strings of letters (i.e., words not found in the dictionary). Foreign or nonsense words can be useful. It’s not enough to simply replace letters in common words with special characters. “$pring&$ummer,” for example, wouldn’t be very strong.

4. Mix it up as much as possible. The more random the better. The problem with random passwords is that they’re hard for users to remember. One solution is for users to create unusual acronyms only they would know. For example, take the phrase “My parents live at 445 N. Locust Street in Elizabethtown, Pennsylvania.” The password version of this might be: “Mpl@445N.LSinE,PA” – a strong password.

5. Avoid using the same password in many different places. Again, users have trouble remembering lots of different passwords and tend to rely on a few choice ones. Hackers know this and will try to exploit it. One solution is to use a password manager service. A password manager will create a strong password for each application and then store it in encrypted language. The user needs to remember only one password (hopefully a strong one) that tells the password manager to unlock or log into any application.
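
For readers who would rather let code do the mixing, here is a small, hedged sketch that generates a password meeting the guidelines above using Python’s standard secrets module. The length and character pools are simply the ones discussed in this list, not a policy recommendation.

```python
import secrets
import string

def strong_password(length: int = 16) -> str:
    """Generate a random password with upper, lower, digit and special characters."""
    if length < 12:
        raise ValueError("Use at least 12 characters")
    pools = [
        string.ascii_uppercase,
        string.ascii_lowercase,
        string.digits,
        "!@#$%^&*()-_=+",
    ]
    # Guarantee at least one character from each pool...
    chars = [secrets.choice(pool) for pool in pools]
    # ...then fill the rest from all pools combined.
    all_chars = "".join(pools)
    chars += [secrets.choice(all_chars) for _ in range(length - len(chars))]
    secrets.SystemRandom().shuffle(chars)
    return "".join(chars)

print(strong_password())   # different every run
```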

Business leaders should ensure that their company employs best practices in preventing data breaches. That includes procedures for designating strong passwords that stymie hackers.

At RDX, security is of paramount importance. Although we don’t store or process any data for our customers, we adhere to one of the most comprehensive security and privacy frameworks in the IT industry and have audited every security control possible within our organization. You don’t become the #1 provider and pioneer of remote DBA services without paying close attention to security issues, especially as they evolve in the future.

Business Intelligence Trends That Will Impact Your Business

The future of business intelligence (BI) promises better access to information, more ways to analyze data and make decisions, and superior methods of reporting results to managers and stakeholders. A modernized, robust BI platform can mean the difference between a company lagging behind or leading the pack. Small and medium-sized businesses especially are joining the BI revolution.

Here are a number of the latest business intelligence trends that may impact a company’s ability to attract and keep customers or clients, while ensuring the smoothest possible operation of the business:

1. Data Discovery Tools. One BI trend sure to be on the radar of every business is data discovery. Data discovery steers business leaders toward new ways of seeing and analyzing company data.

As data becomes more complex, it makes better sense to show it in more interactive and visual presentations. Text and tables of hard-to-consume data are being replaced by charts and graphs that can be flexed and manipulated. These new interactive visualization tools enable decision-makers to see major business trends and to spot problematic issues more quickly.

2. Self-service Business Intelligence. It used to be that when company decision-makers wanted to see business data, they would have to request it from the IT or finance departments. What the decision-maker received may or may not have been useful to them. With self-service BI, business leaders can get instant access to customizable data wherever they are.

More and more, business leaders appreciate the ability to analyze and act on data right at their fingertips. Users can modify their own dashboards, design their own queries, customize their own models, create their own reports and take advantage of other functionality without involving IT or finance staff.

3. Predictive Analytics. Business forecasting has always been important, but taking some of the guesswork out of forecasting can pay off big time for businesses. Predictive analytics is a BI trend that examines complex enterprise data and makes assumptions and projections on future probabilities.

Predictive analytics has many uses when it comes to BI, including: determining the value of prospects and customers; forecasting price, sales and demand; predicting machine failures; and monitoring and evaluating social media. By using predictive analytics, business leaders can better understand the inefficiencies or strengths of their companies and improve future company performance.
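
As a toy, hedged illustration of the forecasting idea – the monthly demand figures below are invented – a few lines of scikit-learn can already fit a trend and project it forward:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical monthly unit demand for the past year.
months = np.arange(1, 13).reshape(-1, 1)
demand = np.array([310, 325, 332, 350, 365, 372, 390, 401, 415, 428, 440, 455])

model = LinearRegression().fit(months, demand)

# Project demand for the next quarter (months 13-15).
future = np.arange(13, 16).reshape(-1, 1)
print(model.predict(future).round())   # rough trend-line forecast
```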

4. Data Quality Management. We’ve all heard the prophetic adage “garbage in, garbage out.” This is especially true for BI. Without reliable, accessible and coherent data, BI might produce false assumptions and spawn poor decisions that harm the company.

Data quality management uses a number of parameters to ensure the integrity of information. These include: completeness, accuracy, validity, uniqueness, consistency and timeliness. Managing the quality of data is vitally important to making the right company decisions and optimizing success.
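
A first pass at several of those parameters can be automated with a few lines of pandas. The hedged sketch below – the file and column names are invented – reports completeness, uniqueness and one simple validity rule for a customer extract:

```python
import pandas as pd

# Hypothetical customer extract.
df = pd.read_csv("customers.csv")

# Completeness: share of missing values per column.
print(df.isna().mean().round(3))

# Uniqueness: duplicate customer IDs should be zero.
print("duplicate ids:", df["customer_id"].duplicated().sum())

# Validity: ages outside a plausible range flag a data entry problem.
invalid_age = df[(df["age"] < 0) | (df["age"] > 120)]
print("rows with invalid age:", len(invalid_age))
```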

5. Cloud Analytics. While many businesses already use cloud solutions for data storage, the cloud also provides incredible opportunities for analyzing business information. Employing cloud-based data analytics offers companies up-to-the-minute technology without having to develop expensive in-house expertise.

A cloud BI system can be implemented within a very short time frame and at significantly lower costs than an in-house BI platform. It requires no additional hardware purchases or IT resources, and future costs are much easier to predict. Some businesses have been reluctant to embrace the cloud, but recent improvements in security and infrastructure have virtually wiped out any concerns businesses may have had in that respect.

With BI becoming increasingly important to business success, RDX is committed to helping businesses gain a competitive advantage by providing the expertise, services and scale they need to maximize the value of their BI. Since our inception in 1994, our remote DBA services have helped hundreds of companies improve the quality of their database environments while reducing the costs associated with on-site database management.

We also help clients leverage Microsoft’s industry-leading BI Product Suite (SSIS, SSAS, SSRS) and Power BI, enabling them to turn raw data into revenue. Our customized, scalable Microsoft BI solutions allow companies to unlock the power of data to delight customers, drive differentiation and make better, faster business decisions. It’s why we’re the #1 provider and pioneer of remote DBA services.

ICYMI- Microsoft Business Intelligence Overview and Power BI Demo

October’s RDX Insights Series presentation was an updated version of May’s Microsoft Business Intelligence Overview and Power BI demo. This month, RDX’s VP of Delivery Strategies and Technologies, Chris Foot, and Product Manager of Business Intelligence and Data Warehousing, Jeremy Frye, teamed up to present:

  • An overview of the Microsoft BI product suite
  • Advantages of using Microsoft’s BI products
  • A Power BI demo and how it integrates with SSIS/SSAS/SSRS
  • How to use Power BI to capture, model, transform and analyze key business metrics

If you missed this presentation, you can view and download a copy of the slides on SlideShare.

You can also view our video recording below:

Next month’s RDX Insights Series presentation will be a Microsoft SQL Azure Overview and Demo. We’ll discuss the monitoring, performance, security and availability features available within Azure in addition to how Azure environments differ from their on-premises counterparts. The presentation will conclude with an Azure demo from one of our SQL Server Azure experts, Jim Donahoe.

You can join our mailing list for updates about future RDX Insights Series presentations by emailing info@rdx.com.