Tuesday, August 16, 2016

Database Olympics - Training for the Medals?

I will admit, I have been inspired by watching the Olympics: the work, training and even reinventing that the athletes do to be successful in competition. I enjoy sports and have participated where I could, but I have accepted over the years that being short, and now getting older, has created some physical limitations. It is still fun, and if you didn't know this about me, swimming was the sport that helped fund my schooling (as a coach). I was even a flyer and a sprinter, not too keen on long distances, though a nice long, slow swim is very relaxing to me now. I still get excited watching the swimmers do well and seeing how the sport has changed over the years.
The photo for my blog here is even me standing next to Michael Phelps just a "few" years ago...
One might even say that I temporarily retired from writing blog posts (considering how old the last post is), but I now feel that this is part of my new training plan. I have been impressed by how Michael Phelps refocused and set goals in order to accomplish what he has this year. Katie Ledecky, even though she competes in the longer distances, has dominated her events; I am amazed by how hard she works and trains, and having seen earlier photos of her with Phelps, she might have used meeting him as encouragement to work even harder.
But this isn't an article about Phelps, it is about what we can learn and be inspired to do as Olympians in our own field. Yes, I said that we are Olympians. There is even a TV commercial out there with Kayla Harrison (awesome Judo athlete) saying that things we do normally earn us medals. (If you don't know anything about Kayla, google her and "This is My Day").
So, what are we training for? Any upcoming events? Where do we have to do things differently? What conditions have changed? If we are looking at database environments, there is enormous growth and potential because of new technologies and business needs. Databases are being provided as services, and self-provisioning has definitely changed the landscape for DBAs; one might even say that DBAs need to adjust because older skills are aging out and newer (younger) options are coming. The knowledge is needed to provide data intelligence while still providing highly available, well-performing and secured data sources.
We might be preparing for a migration or upgrade (12cR2???) event coming soon. Our training would be learning the new features, testing our environments and getting prepared to succeed in that event. Just imagine celebrating those victories with medals or other types of awards.
Doing things differently, we might adapt new technologies or automate parts of our job in order to focus on different areas to work more with the business to provide the data solutions.
Data security is a higher priority, and it is where I have been focused: looking at access controls and data protection, and validating and monitoring that the controls are in place and continue to be effective.
Processes, new technologies and working differently are things I have heard from the athletes that come back to the Olympics year after year, so why shouldn't we embrace that for what we do and how we work in our environments?
Validate the processes and controls, use services and automation so tasks don't have to be constantly repeated and delivery becomes "faster", and continue to learn and, if needed, refocus on areas that meet business requirements.
Let's be inspired to learn more, work more efficiently and celebrate what we do.

Thursday, October 10, 2013

Proof in the Privileges

Asking for full permissions on the database, tables or schemas may have been an easy way to do upgrades or run other database applications, but it doesn't provide a secure environment and it definitely doesn't get to the level of least privilege. Least privilege is a way to minimize access to sensitive data, and granting only the permissions needed is a key step for database security.
Knowing what permissions have been granted is easy to find out by querying the catalog views and auditing changes in grants. With Privilege Analysis we can also verify what permissions are actually being used. The analysis can provide roles that contain only the permissions used, or scripts to revoke the unused permissions. It is nice to have proof that only a SELECT or EXECUTE on a few objects is required instead of the DBA role.
So how does it work? The DBMS_PRIVILEGE_CAPTURE package is used to define a capture, gather the information about what permissions are used, and generate the results. The views DBA_USED_PRIVS and DBA_UNUSED_PRIVS are some of the views that contain the information from the capture.

BEGIN
  DBMS_PRIVILEGE_CAPTURE.CREATE_CAPTURE (
    name        => 'dba_capture_all_privs',
    description => 'privilege_analysis_example_for_all_users',
    type        => DBMS_PRIVILEGE_CAPTURE.G_DATABASE);
END;
/


There are capture types that will capture privileges for the database, roles and context: G_DATABASE, G_ROLE, G_CONTEXT and G_ROLE_AND_CONTEXT.
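Creating the capture does not start collecting anything by itself. The flow, using the capture name from the example above, is to enable the capture, run the workload, then disable it and generate the results:

```sql
BEGIN
  -- start recording which privileges get used
  DBMS_PRIVILEGE_CAPTURE.ENABLE_CAPTURE(name => 'dba_capture_all_privs');
END;
/

-- ... run the application workload while the capture is enabled ...

BEGIN
  -- stop recording and populate the DBA_USED_* / DBA_UNUSED_* views
  DBMS_PRIVILEGE_CAPTURE.DISABLE_CAPTURE(name => 'dba_capture_all_privs');
  DBMS_PRIVILEGE_CAPTURE.GENERATE_RESULT(name => 'dba_capture_all_privs');
END;
/
```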
Creating scripts to create a role or revoke privileges can be done with a query against the DBA_USED_OBJPRIVS or DBA_UNUSED_OBJPRIVS views.
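As a sketch (the column names here are from my testing and worth verifying against your release), a query like this builds the REVOKE statements for object privileges that were granted but never exercised during the capture window:

```sql
SELECT 'REVOKE ' || obj_priv || ' ON ' || object_owner || '.' || object_name
       || ' FROM ' || username || ';' AS revoke_stmt
  FROM dba_unused_objprivs
 WHERE capture = 'dba_capture_all_privs';
```

Spool the output to a file, review it, and you have a revoke script backed by proof.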

It is great to have proof of what privileges are being used and being able to do the analysis to help secure the environment.

There are several layers of security that are important for the environment. To get more information about Oracle Database security, a few of us have written an ebook, Securing Oracle Database 12c: A Technical Primer, with 12c information and references, including a few more examples on Privilege Analysis. It is available free for a limited time by registering at www.mhprofessional.com/dbsec with code db12c. Be sure to check it out, as there are some excellent examples on auditing, encryption, handling privileged users and many other topics.

Monday, August 5, 2013

Pluggable databases - logging in

User group technology days and meetings are great not only for the presentations but for the hallway conversations. In a short discussion in the hall, I realized I had forgotten to mention something in my presentation the other day about pluggable databases, an Oracle Database 12c new feature, and didn't even realize until having this conversation that it would be of interest to others. It is something that is fairly simple once you figure it out, but it can cause a few minutes of distress, or a couple of hours of doubting how one is even a DBA and surviving.
With pluggable databases there is quite a bit of discussion around creating them and moving a pluggable database from one system to another (unplug/plug). The Database Configuration Assistant (dbca) is easily used to create a container or pluggable database. Normally after creating the database in a Linux environment, the DBA goes into sqlplus from the command line, logs into the database and does some validations. You will probably find that logging in with connect / as sysdba gets you connected to the container database. But how does one get to the pluggable database? And an even better question: is the pluggable database even available?
Each of the pluggable databases can be opened and closed individually. Shutting down the container database will shut down all of the pluggable databases, but starting up the container database doesn't mean all of the pluggable databases are started. To verify the pluggable database is open, log in to the container database through sqlplus and run the following:
SQLPLUS> select name,open_mode from v$pdbs;

NAME                           OPEN_MODE
------------------------------ ----------
PDB$SEED                       READ ONLY
PDBMM2                         MOUNTED

Notice that PDBMM2 is only in MOUNTED state and not open. To open the pluggable database, run one of the following options:
SQLPLUS> alter pluggable database all open;
SQLPLUS> alter pluggable database PDBMM2 open;
SQLPLUS> alter pluggable database all except PDBMM1 open;
Now that the database is open, connecting to the pluggable database requires the same information we have needed to connect to any database: service name, port and host. Each pluggable database registers a service with its own name, so the connection looks like logging into a normal database instance as in previous releases, with the service name included in the login: username@PDBMM2 (with PDBMM2 defined in tnsnames.ora or given as an easy connect string).
The pluggable databases will be easily accessible through Enterprise Manager and other tools like SQL Developer, but the setup is just like logging into a database instance using the service: it doesn't need the name of the container database, just the service name of the pluggable database. From the server, logging in through sqlplus might at first be confusing if you are in the container database and trying to get to the pluggable database. A connect username@PDBMM2 will get you there, or, if you are already connected to the container as a common user, an alter session set container = PDBMM2.
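As a quick sketch (the host, port and user names here are made up for the example), either of these approaches lands the session in the PDB, one with an easy connect string using the PDB's service name, the other by switching containers from a session in the container database:

```sql
-- from the OS command line, using the PDB service name:
--   sqlplus scott@//dbhost:1521/pdbmm2

-- or, from a sqlplus session in the container database:
alter session set container = PDBMM2;
show con_name
```

Either way, show con_name is a handy check that you really are where you think you are.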

Monday, October 1, 2012


CDBs and PDBs

Say it real fast and it sounds like Twiki of Buck Rogers. (At least that was part of the discussion with some other Oracle ACE Directors.) The other part of the discussion was that this can provide a great way to consolidate, patch/upgrade and maintain Oracle databases.

So what do these new acronyms mean? CDB - container database: the global area for the database, containing the main system information. PDB - pluggable database: the user/application information, with the user tables and system information about all of the objects in the pluggable database. This is a key new feature of the Oracle 12c database.

Just start to think about what this can mean. It means I can have a few container databases (CDBs) with multiple pluggable databases (PDBs) in each container. I can back up and recover a PDB to a point in time, I can clone a PDB in seconds, and I can plug a database into a patched CDB and have that PDB on the patched version as well. Each PDB is isolated from the other PDBs, and there are security options for access to a CDB and different logins for PDBs to keep access separate. Databases from previous versions are now called non-CDBs, and non-CDBs are also still available in 12c; they behave like the current database instances with schemas and shared system information. They are easy to manage in the database tools, like the Database Configuration Assistant, Oracle Enterprise Manager and SQL Developer.
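Based on the syntax being shown this week (the names are examples, and details may change before the release), cloning a PDB looks as simple as this, assuming Oracle Managed Files so the datafile names are handled automatically and the source PDB is open read-only:

```sql
-- quiesce the source so it can be copied consistently
ALTER PLUGGABLE DATABASE pdbmm2 CLOSE;
ALTER PLUGGABLE DATABASE pdbmm2 OPEN READ ONLY;

-- create and open the clone
CREATE PLUGGABLE DATABASE pdb_clone FROM pdbmm2;
ALTER PLUGGABLE DATABASE pdb_clone OPEN;
```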
The rest of the week at OOW should provide more information about CDBs and PDBs. This is a nice new feature of Oracle 12c and provides an easy way to manage different applications in one CDB. Faster too! Another bonus.

Oracle Technology

Why is technology fun? It is always changing and providing new solutions and new innovations. If you want a simple career, then technology is not for you. Especially as DBAs, we have new things happening all of the time. More data, big data, faster hardware!

Yes, OOW does speak loud and proud about Oracle technology, the things they are doing well and how they have the best of breed in the technology stack. It also gives motivation to look at things differently and provide value to the business.

As a DBA, some things get simpler, while other opportunities in our jobs keep us challenged. Cloud offerings, engineered systems, and better performance from software and hardware are a few things that make life simpler. DBAs have the opportunity to look at managing these engineered systems, working with cloud offerings and database as a service, and even developing into the role of a Database Machine Administrator (DMA).

There are still the challenges of data: what data the business needs, integration of data, securing data for the business. Is this an emerging role for the DBA? Do we need Big Data DBAs? Which new features offer benefits and should be implemented? Even if something hasn't worked in the past or wasn't seen as important, is it now?

That might be the one interesting thought here: even if in previous years at OOW the cloud didn't seem important, coming back and seeing it with a new set of eyes, and recognizing the benefits that are there now, is how we should be looking at our database environments. Take a new look, take advantage of new technology, maybe revisit a direction that was rejected in the past but might be worth it now. The same goes for the role of the DBA: not just creating databases and adding users, but new tools and new opportunities.

OOW 2012 - Keynotes

Oracle Cloud and Engineered Systems were already mentioned last night, and today is going to be a good day for the database. More details on the latest version of the database should be provided.

Even though the keynotes are high level about Oracle products and the stack, they give a good picture of what is currently important to the Oracle executives and the direction the Oracle products are heading. Get the big picture first and then follow up with sessions to dive into more details.

The other great opportunity is to network with the user community and see who is looking forward to implementing new features and what products have been game changers in their company.

Wednesday, January 25, 2012

Security DBA Responsibility

"It would be easier to implement a patch policy if it came from management and we could get downtime", "I don't know what data to encrypt so the application owners need to tell me", "Isn't it the security team's responsibility to make sure that access changes when the jobs change". These are probably things we have heard DBAs say and maybe have even said ourselves. Security is something that a DBA is responsible for implementing, but where does the responsibility end?
I was reminded about this topic the other day and started to think about it. If there was an outage because something happened during an upgrade, and the DBA was not able to restore the database, is that the fault of the upgrade or of not being able to restore? Again, we can probably argue both, but at the same time, if I wasn't doing everything possible to ensure I had good backups and could restore from them, I would fault myself.
Yes, it is definitely easier if the push to apply security patches isn't coming just from the DBA team, and if there were top-down mandates to govern security practices, but what can be done by the DBA?
One, patches should be applied to the system, and the DBA can have a solid implementation plan in place to make this easier. If patching succeeded on a regular basis, the outage or maintenance window would probably be easier to get. A good test plan, with reviews of pre-checks and post-scripts, makes sure everything gets covered in a step-by-step way. This process is something the DBA can own and promote: a well-documented plan with details about testing and success rate.
Two, encryption at the tablespace level. This is transparent to the application, and if the DBA knows that the database can have sensitive information, then it is definitely worth raising the risk of what happens if this data at rest is accessed. Encrypting the tablespace means that you don't have to know exactly which fields are sensitive, either. It would be great to have the conversations about controlling access and masking this data in test environments, but at least reducing the risk to data at rest and outside of the application is worth it. This is also a feature that is fun to implement as a DBA, because it is on the back end and is simply how you create the tablespace.
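A minimal sketch of an encrypted tablespace, assuming the wallet/keystore is already created and open, and Oracle Managed Files so the datafile name is generated (the tablespace and table names are just examples):

```sql
-- everything stored in this tablespace is encrypted at rest
CREATE TABLESPACE secure_data
  DATAFILE SIZE 100M
  ENCRYPTION USING 'AES256'
  DEFAULT STORAGE (ENCRYPT);

-- the application sees nothing different; encryption is transparent
CREATE TABLE hr_bonus (emp_id NUMBER, bonus NUMBER)
  TABLESPACE secure_data;
```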
Three, protect yourself from seeing data you don't really want to see in the first place. Being able to prove that even with SYSDBA permissions you can't see the data in a Database Vault realm protects you from knowing what you really don't want to know. Using Database Vault protects the data from the administrators but still allows the job to get done. Database Vault is an option that can be configured after creating the database using dbca (Database Configuration Assistant). The realms would be managed by someone outside of the DBAs or SYSDBA to make sure these permissions cannot be granted back to the SYSDBA.
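As a sketch of what a realm looks like (run as the Database Vault owner, not SYSDBA; the realm name and the HR schema are examples), DBMS_MACADM creates the realm and then puts the objects to protect inside it:

```sql
BEGIN
  -- define the realm and audit failed access attempts
  DBMS_MACADM.CREATE_REALM(
    realm_name    => 'HR Data Realm',
    description   => 'Protect the HR schema from privileged users',
    enabled       => DBMS_MACUTL.G_YES,
    audit_options => DBMS_MACUTL.G_REALM_AUDIT_FAIL);

  -- protect every object in the HR schema
  DBMS_MACADM.ADD_OBJECT_TO_REALM(
    realm_name   => 'HR Data Realm',
    object_owner => 'HR',
    object_name  => '%',
    object_type  => '%');
END;
/
```

After this, a SELECT against HR tables as SYSDBA fails unless an authorization has been added to the realm, which is exactly the proof described above.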
Four, check out the database firewall. We should be evaluating and looking at new security features. The firewall can help in the fight against SQL injection, and examining new features and doing the research is useful for understanding whether it is something that would be of value and reduce some risk in your environment.
Five, educate the business and the users of the environment. This also ties in with the research about features. After an initial understanding of what can be secured, monitored and implemented, then it is time to talk. Discussing what it takes to implement and the risk involved is a great way to look at some of the value you can get out of securing the environment.
These are good steps to be taking as a DBA to provide a secured database environment.
It is not necessarily going to be the easier road to take, but there are some things we can do. And persistence is on our side, because I would definitely rather err on the side of trying to implement the needed security and communicating the risks than be caught when it isn't implemented without even having tried.