Webinar – Move To The Cloud

I’ll be running a live webinar on the 16th September 2020 in conjunction with Mark McGill from Delphix, where we will do a live demo of some really cool stuff with data virtualisation in the cloud.

Using Delphix with the public cloud (AWS in this case), we will demo how easy it is to spin up secure dev/test/analytics environments of any size, on demand.

I’ve been working on this as a solution for a customer over the last few months, and the possibilities it opens up are numerous. Gone are the days of the lengthy procurement process to get some tin in the data centre, the request process to have VMs spun up, the capacity planning headaches, or the ongoing costs even when you’re not using your infrastructure.

The public cloud solves all that, but mix it with Delphix and you also solve the problem of moving data around environments. It literally takes minutes to spin up 1, 10 or 100 new copies of a masked production database, on server resources of your choice, to meet your requirements right now.

I’m pretty excited about this so come have a look at the demo and see what you think.

Here’s the registration page. If you can’t make it, register anyway and you’ll be sent a recording to view at your leisure.

See you there.


AWS Certified Solutions Architect

/* begin rant

Finally, after three attempts at sitting the AWS Certified Solutions Architect – Associate exam, I actually got to do it this week. Not that I failed twice before (that I could have dealt with better); the previous two attempts were remote exams via PearsonVUE where the online proctor didn’t show up!

It frustrated me enough the first time, but the second time really did grind on me, and I vowed not to try any more online exams with PearsonVUE and to wait for test centres to reopen, which some now have.

Continue reading “AWS Certified Solutions Architect”

Solving Puzzles

It wasn’t until I sat down and reflected on my most recent, and arguably greatest ever (!), accomplishment that I realised I am always solving puzzles in my professional life. We all are, in some way, on a daily basis.

Growing up in the eighties with two older brothers, there was always a Rubik’s Cube knocking about the house, and I spent many an hour getting frustrated, never solving more than two faces of this simple but perplexing puzzle. I think at one point I teased all the stickers off, stuck them back in order and proudly presented my work to my oldest brother, who clearly was not convinced.

Fast forward a “few” years… I saw a Rubik’s Cube on TV one Sunday morning a few weeks ago and realised I had unfinished business with this multi-coloured plastic cube of annoyance. I hate not seeing a challenge through, and usually I persist until a solution is found, but it doesn’t usually take me 35+ years!

Continue reading “Solving Puzzles”

Delphix 6.0.2 Release

I don’t usually create a post just for new Delphix Dynamic Platform releases, especially minor releases, but this one does deserve some air time.

Over the years that I’ve worked with the platform there have been many feature enhancements in each new release, whether major or minor. The engineering and development teams at Delphix obviously work tirelessly to improve the product to meet customer requests, and the latest (minor) release proves that.

There are four particular updates in the upcoming 6.0.2 release that I’m going to highlight, purely because they’ve been on my list of enhancements I’d like to see for a long time. I know for a fact that the customers I support will quickly upgrade to this release for some, if not all, of these updates.

Continue reading “Delphix 6.0.2 Release”

Delphix Self-Service Configuration

One of the key features of the Delphix Dynamic Data Platform is the ability for the data consumer (developer, tester, QA, data analyst) to self-serve data in an extremely functional and intuitive way. Operations like refresh, rewind, bookmark, branch and share at the touch of a button give the end user unprecedented speed and agility while working with the data they need, without raising tickets and bothering the DBAs and sysadmins in the process. Win-win!

However, to take advantage of these rich features, the self-service environment must first be configured by the Delphix administrator, and there are some things to consider up front to ensure the consumer gets the data they want, when they want it.

Continue reading “Delphix Self-Service Configuration”

Maximum Performance Data Masking

The process of data masking by its very nature can be a lengthy operation – by this I mean the actual read, transform, update process performed by the masking tool. The time taken from start to finish is directly proportional to the quantity and complexity of the data to be masked.

No two datasets are the same. We have different data store technologies, different schema designs, different ratios of sensitive data contained in the data store. I am often asked how long the masking process will take for a given database before I’ve even seen it! There is no rule of thumb here. The only way we can give a useful estimate is by creating, running and timing the actual masking job(s) themselves.

What we can do, though, and must do, is design our masking configurations and jobs with maximum performance in mind. Performance tuning is often an afterthought that inevitably forces rework, whereas if we work from the outset with performance in mind we can keep time and effort to a minimum and, most importantly, the process runtime too.
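As a back-of-the-envelope illustration of why only a timed run gives a useful estimate, here’s a minimal sketch in Python of the read, transform, update loop with timing around it. The hashing transform, column names and row counts are all hypothetical – this is not how Delphix masks data, just a way to show that throughput falls out of running the actual job against the actual data:

```python
import hashlib
import time

def mask_value(value: str, salt: str = "demo-salt") -> str:
    """Deterministically mask a sensitive string (illustrative only;
    real masking tools typically use format-preserving transforms)."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def time_masking_job(rows, sensitive_cols):
    """Read -> transform -> 'update' each row, returning the masked
    rows and the elapsed time."""
    start = time.perf_counter()
    masked = []
    for row in rows:
        masked.append({
            col: mask_value(val) if col in sensitive_cols else val
            for col, val in row.items()
        })
    elapsed = time.perf_counter() - start
    return masked, elapsed

# Synthetic dataset: the only reliable estimate comes from timing a run
# against data of representative size and shape.
rows = [{"id": str(i), "name": f"user{i}", "city": "Leeds"}
        for i in range(100_000)]
masked, elapsed = time_masking_job(rows, sensitive_cols={"name"})
print(f"Masked {len(masked)} rows in {elapsed:.2f}s "
      f"({len(masked)/elapsed:,.0f} rows/s)")
```

Changing the row count, the number of sensitive columns or the transform itself moves the runtime, which is exactly why no rule of thumb survives contact with a real database.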

Let’s explore the key areas that affect performance in a typical Delphix data masking enterprise implementation.

Continue reading “Maximum Performance Data Masking”

Self Service Data

Move Data

Self-service of data is a reality. No more waiting for others to provide data copies to you. No more requests to the DBA team to refresh your analytics environment, or to set a restore point on your test database and later rewind it when things have gone wrong. No more tickets raised with the middleware team to refresh your development application with a production copy.

I worked as a DBA for many years, and one of the most common tasks I had to perform was moving data around the organisation. In fact, my very first job as a trainee was to refresh the management information system every Monday morning so the decision makers could work with the latest business data that week. It would take me all morning, which meant no MIS was available on a Monday morning. When I look back, I wonder why it was acceptable for a key information system to be unavailable for half a day a week. There’s no way it would be like that today, but there was no better solution at the time.

Back then my week always began with a walk to the computer room, where I would retrieve the DLT (tape backup) that contained the previous night’s backup of the source database and load it into a tape drive on the target server. Then back to my office, where I would kick off a restore from the tape and wait patiently (by the way, this was all Oracle databases hosted on OpenVMS – I do miss OpenVMS!). The database was only a few GB but it would take over two hours to restore! The remaining steps involved deleting the existing target database, recovering the restored database, then a rename, and finally handing it over for a quick QA before releasing it back to the business. A little laborious, to say the least.

Let’s fast forward nearly 20 years and see how the process has changed. Continue reading “Self Service Data”

Virtualise Your Application Alongside Your Data

Delphix Data Pod


Most of what we see and read about the Delphix Dynamic Data Platform talks about how we can move data around the enterprise and cloud rapidly and securely. A key value driver of the product is its test data management (TDM) capability, unsurpassed by anything else in the market right now – the ability to provide data fast, securely and everywhere. And a key component that always impresses the students I teach on my Delphix training courses is the self-service feature set. Handing off control of data to the consumer (the developer, tester, QA, data analyst) is a huge paradigm shift from what we’ve been laboriously doing for decades.

It’s not until you drill down further into the features that you begin to realise the many use cases the Dynamic Data Platform can address, and often these are what I class as added benefits on top of the key marketed features. What I mean by that is that you may not implement the product based purely on that use case, but now you have it, you may as well take full advantage of it. The return on investment grows far greater than originally forecast (and the ROI on Delphix is pretty impressive anyway).

As I said, we talk mainly about data, but what about the application itself? Development and testing, especially in today’s DevOps world, is fast paced, and access to up-to-date, relevant data is key. But obviously we also need to work with the application code, and often move backwards and forwards across many iterations of the application depending on priorities, resource availability, schedules and so on.

Delphix has a concept of the data pod.  What is a data pod?

A data pod is a set of virtual data environments and controls built by the Delphix Dynamic Data Platform and delivered to users for self-service data consumption

Again, it all points towards data, but in actual fact a data pod can contain the application too. So we get all the TDM capability goodness with our application code as well as our data: rapid lightweight provisioning, bookmarking, branching, rewinding and sharing, to name a few.

Let’s dig into this and see how it fits together, and look at some of the main benefits.

Continue reading “Virtualise Your Application Alongside Your Data”