Self Service Data

Move Data

Self-service data is a reality.  No more waiting for others to provide data copies to you.  No more requests to the DBA team to refresh your analytics environment, or to set a restore point on your test database and later rewind it when things have gone wrong.  No more tickets raised with the middleware team to refresh your development application with a production copy.

I worked as a DBA for many years, and one of the most common tasks I had to perform was the movement of data around the organisation.  In fact, my very first job as a trainee was to refresh the management information system every Monday morning so the decision makers could work with the latest business data that week.  It would take me all morning, which meant no MIS was available on Monday mornings.  When I look back I wonder why it was acceptable for a key information system to be unavailable for half a day each week.  There’s no way it would be like that today, but there was no better solution at the time.

Back then my week always began with a walk to the computer room, where I would retrieve the DLT (tape backup) that contained the previous night’s backup of the source database and load it into a tape drive on the target server.  Then back to my office, where I would kick off a restore from the tape and wait patiently (by the way, this was all Oracle databases hosted on OpenVMS – I do miss OpenVMS!).  The database was only a few GB but it would take over 2 hours to restore!  The remaining steps involved deleting the existing target database, recovering the restored database, followed by a rename, and then handing it over for a quick QA before releasing back to the business.  A little laborious to say the least.

Let’s fast forward nearly 20 years and see how the process has changed. Continue reading “Self Service Data”

Virtualise Your Application Alongside Your Data

Delphix Data Pod


Most of what we see and read about the Delphix Dynamic Data Platform talks about how we can move data around the enterprise and cloud rapidly and securely.  A key value driver of the product is its test data management (TDM) capability, unsurpassed by anything else on the market right now – the ability to deliver data fast, securely and everywhere.  And a key component that always impresses the students I teach on my Delphix training courses is the self-service features.  Handing off control of data to the consumer (the developer, tester, QA engineer, data analyst) is a huge paradigm shift from what we’ve been laboriously doing for decades.

It’s not until you drill down further into the features that you begin to realise the many use cases the Dynamic Data Platform can address, and often these are what I class as added benefits beyond the key marketed features.  What I mean by that is you may not implement the product based purely on one of those use cases, but now that you have it you may as well take full advantage of it.  The return on investment grows far greater than originally forecast (and the ROI on Delphix is pretty impressive anyway).

As I said, we talk mainly about data, but what about the application itself?  Development and testing, especially in today’s DevOps world, is fast paced, and access to up-to-date and relevant data is key.  But obviously we also need to work with the application code, and often move backwards and forwards across many iterations of the application depending on priorities, resource availability, schedules and so on.

Delphix has a concept of the data pod.  What is a data pod?

A data pod is a set of virtual data environments and controls built by the Delphix Dynamic Data Platform and delivered to users for self-service data consumption.

Again, it all points towards data, but in actual fact a data pod can contain the application too.  So we can have all the TDM capability goodness with our application code as well as our data: rapid lightweight provisioning, bookmarking, branching, rewinding and sharing, to name a few.

Let’s dig into this, see how it fits together, and look at some of the main benefits.
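To get a feel for what those self-service controls mean in practice, here is a toy model of a data pod timeline in Python.  This is purely an illustrative sketch – the class and method names are invented for this example and this is not Delphix code; in reality the engine manages these operations against virtual copies of the data.

```python
# Toy model of data pod self-service controls: bookmark, branch, rewind.
# All names here are invented for illustration; this is NOT the Delphix API.

class Branch:
    def __init__(self, name, states):
        self.name = name
        self.states = list(states)  # ordered timeline of data states

class DataPod:
    def __init__(self, initial_state):
        self.branches = {"master": Branch("master", [initial_state])}
        self.active = "master"
        self.bookmarks = {}  # bookmark name -> (branch name, timeline index)

    def refresh(self, new_state):
        """Append a new data state to the active branch's timeline."""
        self._branch().states.append(new_state)

    def bookmark(self, name):
        """Mark the current point in time so it can be shared or returned to."""
        b = self._branch()
        self.bookmarks[name] = (b.name, len(b.states) - 1)

    def branch(self, bookmark_name, branch_name):
        """Create a new branch starting from a bookmarked point in time."""
        src, idx = self.bookmarks[bookmark_name]
        self.branches[branch_name] = Branch(
            branch_name, self.branches[src].states[: idx + 1]
        )
        self.active = branch_name

    def rewind(self, bookmark_name):
        """Discard later states on the bookmarked branch; return to that point."""
        src, idx = self.bookmarks[bookmark_name]
        self.branches[src].states = self.branches[src].states[: idx + 1]
        self.active = src

    def current(self):
        return self._branch().states[-1]

    def _branch(self):
        return self.branches[self.active]

pod = DataPod("prod-copy-v1")
pod.bookmark("baseline")            # mark the clean starting point
pod.refresh("after-test-run")       # timeline moves on
pod.branch("baseline", "bugfix")    # new branch from the clean baseline
print(pod.current())                # prod-copy-v1
pod.rewind("baseline")              # master is back where we started
```

The point of the sketch is the shape of the workflow: the consumer, not the DBA, decides when to mark, branch and rewind, and each operation is a metadata change rather than a full data copy.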

Continue reading “Virtualise Your Application Alongside Your Data”

Recovery From Logical Data Corruption Using Delphix

Logical Corruption


Delphix data virtualization can provide the solution to numerous use cases for the provision and consumption of data.  When you begin to work with the product, you can find yourself coming up with all kinds of weird and wonderful ways to utilize its functionality, often replacing existing costly solutions – and doing so quickly.

Here’s a use case you may not have thought of, or at least one that didn’t occur to you when you virtualized and provisioned your first VDB or vFile.

Warning: this post is a little more wordy than usual!

Continue reading “Recovery From Logical Data Corruption Using Delphix”

Faults – Ignore or Resolve – Part II

Delphix 5.2 Faults


Following on from the post covering the new version 5.2 Delphix Management Interface, I now need to write a part II to the Faults post, which explained the difference between ignoring and resolving faults in the Delphix virtualization engine, and how to fix an accidentally ignored fault in versions prior to 5.2.  As I explained there, we needed to use the CLI to switch the fault to resolved, which would have the effect of the fault being alerted again if it recurred.  Now, in Delphix 5.2, we can do this through the redesigned (and renamed) Management Interface.
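The distinction between the two states is worth pinning down, because it drives the alerting behaviour.  As a toy sketch of the semantics (plain Python with invented names – this is not the Delphix API): an ignored fault stays silent even if the underlying condition fires again, whereas a resolved fault is closed and will alert as new if the condition recurs.

```python
# Toy sketch of ignore-vs-resolve fault semantics; names are invented
# for illustration and this is NOT the Delphix API.  Ignored faults are
# suppressed on recurrence; resolved faults re-alert on recurrence.

class FaultManager:
    def __init__(self):
        self.active = {}     # fault_id -> description
        self.ignored = set() # faults hidden permanently
        self.alerts = []     # alert log

    def raise_fault(self, fault_id, description):
        if fault_id in self.ignored:
            return  # suppressed: ignored faults never re-alert
        if fault_id not in self.active:
            self.active[fault_id] = description
            self.alerts.append(f"ALERT {fault_id}: {description}")

    def ignore(self, fault_id):
        """Hide the fault; it will NOT alert again if it recurs."""
        self.active.pop(fault_id, None)
        self.ignored.add(fault_id)

    def resolve(self, fault_id):
        """Close the fault; if the condition recurs, it alerts again."""
        self.active.pop(fault_id, None)

fm = FaultManager()
fm.raise_fault("FAULT-1", "snapshot space low")   # first alert
fm.resolve("FAULT-1")
fm.raise_fault("FAULT-1", "snapshot space low")   # alerts again: second alert
fm.ignore("FAULT-1")
fm.raise_fault("FAULT-1", "snapshot space low")   # silent
print(len(fm.alerts))  # 2
```

This is why an accidentally ignored fault is a problem: the condition can keep recurring without anyone being told, and the fix is to move it into the resolved state.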

Continue reading “Faults – Ignore or Resolve – Part II”