AWS Certified Solutions Architect

/* begin rant

Finally, after three attempts at sitting the AWS Certified Solutions Architect – Associate exam, I actually got to do it this week.  It's not that I failed twice before (that I could have dealt with better); the previous two attempts were remote exams via PearsonVUE where the online proctor didn't show up!

It frustrated me enough the first time, but the second time really ground on me, and I vowed not to attempt any more online exams with PearsonVUE and instead wait for test centres to reopen, some of which now have.

Continue reading “AWS Certified Solutions Architect”

Delphix 6.0.2 Release

I don’t usually create a post just for new Delphix Dynamic Platform releases, especially minor releases, but this one does deserve some air time.

Over the years that I've worked with the platform there have been many feature enhancements in each new release, whether major or minor. The engineering and development teams at Delphix obviously work tirelessly to improve the product to meet customer requests, and the latest (minor) release proves that.

There are four particular updates in the upcoming 6.0.2 release that I'm going to highlight, purely because they've been on my wish list of enhancements for a long time. I know for a fact that the customers I support will quickly upgrade to this release for some, if not all, of these updates.

Continue reading “Delphix 6.0.2 Release”

Enterprise DataOps Security

Kuzo Data has released a whitepaper on Enterprise DataOps Security, where you can read why and how to secure your enterprise DataOps platform (read: the Delphix DDP).

Continue reading “Enterprise DataOps Security”

Delphix Self-Service Configuration

One of the key features of the Delphix Dynamic Data Platform is the ability for the data consumer (Developer, Tester, QA Engineer, Data Analyst) to self-serve data in an extremely functional and intuitive way.  Operations like refresh, rewind, bookmark, branch and share at the touch of a button give the end user unprecedented speed and agility while working with the data they need, without raising tickets and bothering the DBAs and sysadmins in the process.  Win-win!

However, to take advantage of these rich features the Self-Service environment must first be configured by the Delphix Administrator, and there are some things to consider up front to ensure the consumer gets the data they want, when they want it.

Continue reading “Delphix Self-Service Configuration”

Maximum Performance Data Masking

The process of data masking is by its very nature a lengthy operation – by this I mean the actual read, transform, update cycle performed by the masking tool. The time taken from start to finish is directly proportional to the quantity and complexity of the data to be masked.

No two datasets are the same. We have different data store technologies, different schema designs, different ratios of sensitive data contained in the data store. I am often asked how long the masking process will take for a given database before I’ve even seen it! There is no rule of thumb here. The only way we can give a useful estimate is by creating, running and timing the actual masking job(s) themselves.
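There really is no shortcut: run the job and time it. A minimal sketch of such a timing harness is below; the `run_job` callable is a hypothetical stand-in for whatever kicks off your real masking job, not part of any masking tool's API.

```python
import time

def time_masking_job(run_job, label):
    """Run one masking job and report its wall-clock duration.

    `run_job` is any zero-argument callable that executes the job;
    here it is a hypothetical stand-in for the real masking run.
    """
    start = time.perf_counter()
    run_job()
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.1f}s")
    return elapsed

if __name__ == "__main__":
    # A sleep stands in for a real masking job in this sketch.
    time_masking_job(lambda: time.sleep(0.1), "customers-db")
```

Timing each job like this, per dataset, is the only estimate worth giving.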

What we can do, though, and must do, is design our masking configurations and jobs with maximum performance in mind. Performance tuning is often an afterthought that inevitably forces rework, whereas if we work with performance in mind from the outset we can keep time and effort to a minimum and, most importantly, keep the process runtime down too.

Let’s explore the key areas that affect performance in a typical Delphix data masking enterprise implementation.

Continue reading “Maximum Performance Data Masking”

Self Service Data

Move Data

Self service of data is a reality.  No more waiting for others to provide data copies to you.  No more requests to the DBA team to refresh your analytics environment or set a restore point on your test database and later rewind it when things have gone wrong.  No more tickets raised with the middleware team to refresh your development application with a production copy.

I worked as a DBA for many years, and one of the most common tasks I had to perform was moving data around the organisation.  In fact, my very first job as a trainee was to refresh the management information system every Monday morning so the decision makers could work with the latest business data that week.  It would take me all morning, which meant no MIS was available on a Monday AM.  When I look back I wonder why it was OK for a key information system to be unavailable for half a day every week.  There's no way it would be like that today, but there was no better solution at the time.

Back then my week always began with a walk to the computer room, where I would retrieve the DLT (tape backup) that contained the previous night's backup of the source database and load it into a tape drive on the target server.  Then it was back to my office, where I would kick off a restore of the tape and wait patiently (by the way, this was all Oracle databases hosted on OpenVMS – I do miss OpenVMS!).  The database was only a few GB but it would take over two hours to restore!  The remaining steps involved deleting the existing target database, recovering the restored database, followed by a rename, and then handing it over for a quick QA before releasing it back to the business.  A little laborious, to say the least.

Let’s fast forward nearly 20 years and see how the process has changed. Continue reading “Self Service Data”

Virtualise Your Application Alongside Your Data

Delphix Data Pod

Most of what we see and read about the Delphix Dynamic Data Platform talks about how we can move data around the enterprise and cloud rapidly and securely.  A key value driver of the product is its test data management (TDM) capability, unsurpassed by anything else on the market right now: the ability to provide data fast, secure and everywhere.  And a key component that always impresses the students I teach on my Delphix training courses is the self-service features.  Handing control of data off to the consumer (the Developer, Tester, QA Engineer, Data Analyst) is a huge paradigm shift from what we've been laboriously doing for decades.

It's not until you drill down further into the features that you begin to realise the many use cases the Dynamic Data Platform can address, and often these are what I class as added benefits on top of the key marketed features.  What I mean by that is you may not implement the product purely for that use case, but now that you have it you may as well take full advantage of it.  The return on investment grows far greater than originally forecast (and the ROI on Delphix is pretty impressive anyway).

As I said, we talk mainly about data, but what about the application itself?  Development and testing, especially in today's DevOps world, is fast-paced, and access to up-to-date and relevant data is key.  But obviously we also need to work with the application code, often moving backwards and forwards across many iterations of the application depending on priorities, resource availability, schedules and so on.

Delphix has a concept of the data pod.  What is a data pod?

A data pod is a set of virtual data environments and controls built by the Delphix Dynamic Data Platform and delivered to users for self-service data consumption.

Again, it all points towards data, but in actual fact a data pod can contain the application too.  So we can have all the TDM capability goodness with our application code as well as our data: rapid lightweight provisioning, bookmarking, branching, rewinding and sharing, to name a few.

Let's dig into this, see how it fits together and look at some of the main benefits.

Continue reading “Virtualise Your Application Alongside Your Data”

Managing Large Delphix DDP Estates

Ansible Delphix

The DataOps movement is certainly gaining momentum and, as a key supporting tool, the Delphix Dynamic Data Platform (DDP) is quickly growing with it.  More and more large enterprises are deploying tens or even hundreds of Delphix engines throughout their data centres in all parts of the world.  With that comes the inevitable challenge of managing such an estate; specifically with Delphix, managing the system configurations, user accounts and privileges that are locally configured on each engine becomes unwieldy and time consuming.  For instance, the enterprise may have a global support model with a dedicated team of support engineers who look after the day-to-day running of the estate.  When the support team takes on a new engineer who needs access to every engine to perform their role, we really don't want to manually access each engine individually to add the new support user account.

In this post I'm going to look at one way of creating a centralised management solution for such a scenario.  It can then be developed further to achieve much more, such as mass deployment of source and target environment configurations, dSources, VDBs and any other object management requirements you may have. Continue reading “Managing Large Delphix DDP Estates”
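To sketch the idea of defining the user once and fanning it out to every engine (the endpoint path and payload fields below are illustrative assumptions, not the documented Delphix API), a central script could build one identical request per engine:

```python
from dataclasses import dataclass

@dataclass
class EngineRequest:
    url: str        # full endpoint URL for one engine
    payload: dict   # JSON body to POST to that engine

def build_user_requests(engines, username, password):
    """Build one user-creation request per engine, so a single
    definition of the new support user is applied everywhere
    instead of logging in to each engine by hand.

    The "/api/user" path and payload keys are hypothetical,
    for illustration only.
    """
    payload = {"type": "User", "name": username, "credential": password}
    return [
        EngineRequest(url=f"https://{host}/api/user", payload=payload)
        for host in engines
    ]

if __name__ == "__main__":
    # Hypothetical engine hostnames for illustration.
    for req in build_user_requests(["engine-emea-01", "engine-apac-01"],
                                   "support_new", "s3cret"):
        print(req.url)
```

From here an HTTP client (or a configuration management tool such as Ansible) would submit each request, and the same pattern extends to any other object you want managed centrally.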

Recovery From Logical Data Corruption Using Delphix

Logical Corruption

Delphix data virtualization can provide the solution to numerous use cases for the provisioning and consumption of data.  When you begin to work with the product you can find yourself coming up with all kinds of weird and wonderful ways to utilize its functionality, often replacing existing costly solutions, and doing so quickly.

Here’s a use case you may not have thought of, or at least you didn’t when you virtualized and provisioned your first VDB or vFile.

Warning: this post is a little more wordy than usual!

Continue reading “Recovery From Logical Data Corruption Using Delphix”

Faults – Ignore or Resolve – Part II

Delphix 5.2 Faults

Following on from the post covering the new version 5.2 Delphix Management Interface, I now need to write a part II to the Faults post, which explained the difference between ignoring and resolving faults in the Delphix virtualization engine and how to fix an accidentally ignored fault in versions prior to 5.2.  As I explained there, we needed to use the CLI, where we could switch the fault to resolved, which would have the effect of the fault being alerted again if it recurred.  Now in Delphix 5.2 we can do this through the redesigned (and renamed) Management Interface.

Continue reading “Faults – Ignore or Resolve – Part II”