Thursday, October 10, 2013

Proof in the Privileges

Asking for full permissions on the database, tables or schemas may have been an easy way to do upgrades or run other database applications, but it does not provide a secure environment and it definitely doesn't get to the level of least privilege. Least privilege is a way to minimize access to sensitive data, and granting only the permissions needed is a key step for database security.
Knowing what permissions have been granted is easy to find out by querying the catalog views and auditing changes in grants. With Privilege Analysis we can also verify what permissions are actually being used. This analysis can provide roles that have only the permissions used, or scripts to revoke the unused permissions. It is nice to have proof that only SELECT or EXECUTE on a few objects is required instead of the DBA role.
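As a starting point, the grants themselves can be pulled from the catalog views. A minimal sketch, assuming a grantee named APP_USER (the name is made up for the example):

```sql
-- System privileges granted directly to the user
SELECT grantee, privilege
  FROM dba_sys_privs
 WHERE grantee = 'APP_USER';

-- Object privileges granted to the user
SELECT grantee, owner, table_name, privilege
  FROM dba_tab_privs
 WHERE grantee = 'APP_USER';

-- Roles granted to the user (each role carries its own grants)
SELECT grantee, granted_role
  FROM dba_role_privs
 WHERE grantee = 'APP_USER';
```

This shows what has been granted; Privilege Analysis is what tells us which of these grants are actually exercised.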
So how does it work? The DBMS_PRIVILEGE_CAPTURE package is used to define a capture, gather the information about which permissions are used, and generate the results. The views DBA_USED_PRIVS and DBA_UNUSED_PRIVS are some of the views that contain the information from the capture.

BEGIN
  DBMS_PRIVILEGE_CAPTURE.CREATE_CAPTURE (
    NAME        => 'dba_capture_all_privs',
    DESCRIPTION => 'privilege_analysis_example_for_all_users',
    TYPE        => DBMS_PRIVILEGE_CAPTURE.G_DATABASE);
  DBMS_PRIVILEGE_CAPTURE.ENABLE_CAPTURE (NAME => 'dba_capture_all_privs');
END;
/

-- After the workload has run, stop the capture and generate the results
BEGIN
  DBMS_PRIVILEGE_CAPTURE.DISABLE_CAPTURE (NAME => 'dba_capture_all_privs');
  DBMS_PRIVILEGE_CAPTURE.GENERATE_RESULT (NAME => 'dba_capture_all_privs');
END;
/

There are capture types that will capture privileges for the whole database, for roles or for a context: G_DATABASE, G_ROLE, G_CONTEXT and G_ROLE_AND_CONTEXT.
Scripts to create a role or revoke privileges can be built with a query against the DBA_USED_OBJPRIVS or DBA_UNUSED_OBJPRIVS views.
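A sketch of generating the revoke script from the unused object privileges; the column names (USERNAME, OBJ_PRIV, OBJECT_OWNER, OBJECT_NAME, CAPTURE) are assumptions, so check the view definition in your release before running it:

```sql
-- Generate REVOKE statements for object privileges the capture never saw used
SELECT 'REVOKE ' || obj_priv || ' ON ' ||
       object_owner || '.' || object_name ||
       ' FROM ' || username || ';' AS revoke_stmt
  FROM dba_unused_objprivs
 WHERE capture = 'dba_capture_all_privs';
```

Spool the output, review it, and you have proof plus a ready-made cleanup script.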

It is great to have proof of what privileges are being used and being able to do the analysis to help secure the environment.

There are several layers of security that are important for the environment, and to get more information about Oracle Database security a few of us have written an ebook, Securing Oracle Database 12c: A Technical Primer, with references and 12c information. It is available free for a limited time by registering at www.mhprofessional.com/dbsec with the code db12c. The book also has a few more examples on Privilege Analysis. Be sure to check it out, as there are some excellent examples on auditing, encryption, handling privileged users and many other topics.


Monday, August 5, 2013

Pluggable databases - logging in

User group technology days and meetings are not only great for the presentations but for the hallway conversations. In a short discussion in the hall, I realized I had forgotten to mention something in my presentation the other day about pluggable databases, an Oracle Database 12c new feature, and didn't even realize until having this conversation that it would be of interest to others. It is something that is fairly simple once you figure it out, but it can cause a few minutes of distress, or a couple of hours of doubting how one is even a DBA and surviving.
With the pluggable databases there is quite a bit of discussion around creating and moving the pluggable database from one system to another (unplug/plug). The Database Configuration Assistant (dbca) is easily used to create a container or pluggable database. Normally after creating the database in a Linux environment, the DBA goes into sqlplus from the command line, logs into the database and does some validations. You will probably find that logging in with connect / as sysdba gets you connected to the container database. But how does one get to the pluggable database? And an even better question: is the pluggable database even available?
Each of the pluggable databases can be opened and closed individually. Shutting down the container database will shut down all of the pluggable databases, but startup of the container database doesn't mean all of the pluggable databases are started. To verify the pluggable database is open, log in to the container database through sqlplus and run the following:
SQLPLUS> select name,open_mode from v$pdbs;

NAME                           OPEN_MODE
------------------------------ ----------
PDB$SEED                       READ ONLY
PDBMM2                         MOUNTED

Notice that PDBMM2 is only in MOUNTED state and not open. To open the pluggable database, run one of the following options:
SQLPLUS> alter pluggable database ALL open;
SQLPLUS> alter pluggable database PDBMM2 open;
SQLPLUS> alter pluggable database ALL EXCEPT PDBMM1 open;
Now that the database is open, connecting to the pluggable database requires the same information we have needed to connect to any database: service name, port and host. Each pluggable database registers its own service, so the connection is like logging into a normal database instance as in previous releases, with the service name included in the login: username@PDBMM2.
The pluggable databases will be easily accessible through Enterprise Manager and other tools like SQL Developer, and the setup is just like logging into a database instance using the service; it doesn't need the name of the container database, just the service of the pluggable database. From the server, logging in through SQLPLUS might at first be confusing if you are in the container database and trying to get to the pluggable database. It is connect username@pdbmm2 that will get you there, or switching your session over from the container with alter session set container.
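The two paths from the server can be sketched as follows; the host name and the scott account are made up for the example, and the PDB name is from the listing above:

```sql
-- Option 1: connect straight to the PDB service (EZConnect style)
--   sqlplus scott@//dbhost:1521/PDBMM2

-- Option 2: connect to the root container, then switch containers
CONNECT / AS SYSDBA
ALTER SESSION SET CONTAINER = PDBMM2;
SHOW CON_NAME
```

SHOW CON_NAME is a quick sanity check that you really landed in the pluggable database and not the root.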

Monday, October 1, 2012

CDB-PDB-PDB-PDB

Say it real fast and it sounds like Twiki of Buck Rogers. (At least that was part of the discussion with some other Oracle ACE Directors.) The other part of the discussion was that this can provide a great way to consolidate, patch/upgrade and maintain Oracle databases.

So what do these new acronyms mean? CDB - Container database. The container database is the global area for the database and contains the main system information. PDB - Pluggable database. The pluggable database is the user/application information and has the user tables and system information about all of the objects in the pluggable database. This is a key new feature of the Oracle 12c database.

Just start to think about what this can mean. It means I can have a few container databases (CDB) and multiple pluggable databases (PDB) in each container. I can back up and recover a PDB to a point in time, I can clone a PDB in seconds, and I can plug a database into a patched CDB and have that PDB on the patched version as well. The PDB is isolated from other PDBs, and now there are security options for access to a CDB and different logins for PDBs to keep access separate. Databases from previous versions are now non-CDBs. There are also non-CDBs available in 12c that behave like the current database instances, with schemas and shared system information. They are easy to manage in the database tools, like the Database Configuration Assistant, Oracle Enterprise Manager and SQL Developer.
The rest of the week at OOW should provide more information about CDBs and PDBs. This is a nice new feature of Oracle 12c and provides an easy way to manage different applications in one CDB. Faster too! Another bonus.

Oracle Technology

Why is technology fun? It is always changing and providing new solutions and new innovations. If you want to have a simple career, then technology is not for you. Especially as DBAs, we have new things happening all of the time. More data, big data, faster hardware!

Yes, OOW does speak loud and proud about the Oracle technology, the things that they are doing well and how they have the best of breed in the technology stack. It also gives motivation to look at things differently and to provide value to the business.

As a DBA, some things get simpler, while there are other opportunities in our jobs to keep us challenged. Cloud offerings, Engineered Systems, better performance with software and hardware are a few things that make things simpler. DBAs have the opportunity to look at managing these engineered systems, working with cloud offerings and database as service, and even developing more in the role of a Database Machine Administrator (DMA).

There are still challenges around data: what data the business needs, integration of data, securing data for the business. Is this an emerging role as well for the DBA? Do we need Big Data DBAs? Which new features have benefits and should be implemented? Even if something hasn't worked in the past or wasn't seen as important, is it now?

That might be one interesting thought here: even if cloud didn't seem important in previous years at OOW, coming back to see it with a new set of eyes and finding the benefits that are there now is how we should be looking at our database environments. Take a new look, take advantage of new technology, and maybe revisit a direction that was rejected in the past but might be worth it now. The same goes for the role of the DBA: not just creating databases and adding users, but new tools and new opportunities.


OOW 2012 - Keynotes

Oracle Cloud and Engineered Systems were mentioned already last night, and today is going to be a good day for the database. More details on the latest version of the database should be provided.

Even though the keynotes are a high level look at Oracle products and the stack, they give a good picture of what is currently important to the Oracle executives and the direction the Oracle products are heading. Get the big picture first and then follow up with sessions to dive into more details.

The other great opportunity is to network with the user community and see who is looking forward to implementing new features and what products have been game changers in their company.

Wednesday, January 25, 2012

Security DBA Responsibility

"It would be easier to implement a patch policy if it came from management and we could get downtime", "I don't know what data to encrypt so the application owners need to tell me", "Isn't it the security team's responsibility to make sure that access changes when the jobs change". These are probably things we have heard DBAs say, and maybe we have even said them ourselves. Security is something that a DBA is responsible for implementing, but where does the responsibility end?
I was reminded the other day about this topic and started to think about it. If there was an outage because something happened during an upgrade, and the DBA was not able to restore the database, is that the fault of the upgrade or of not being able to restore? Again we can probably argue both, and at the same time I am thinking that if I wasn't doing everything possible to assure that I had good backups and could restore from them, I would fault myself.
Yes, it is definitely easier if it isn't just coming from the DBA team that security patches need to be applied, and if there were top down mandates to govern security practices, but what can be done by the DBA?
One, patches should be applied to the system, and the DBA can have a solid implementation plan in place to make this easier. If patching is successful on a regular basis, the outage or maintenance window is probably easier to get. A good test plan, with review of pre-checks and post-scripts, makes sure everything gets covered in a step-by-step way. This process is something the DBA can own and promote: a well documented plan with details about testing and success rate.
Two, encryption at the tablespace level. This is transparent to the application, and if the DBA knows that the database can have sensitive information, then it is definitely worth raising the question of what happens if this data at rest is accessed. Encrypting the tablespace means that you don't have to know exactly which fields are sensitive, either. It would be great to have the conversations about controlling access and masking this data in test environments, but at least reducing the risk to data at rest and outside of the application is worth it. This is also a feature that is fun to implement as a DBA, because it is on the back end and is part of how you create the tablespace.
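A sketch of what the tablespace side looks like; it assumes the encryption keystore (wallet) has already been configured and opened, and the tablespace, datafile path and table names are made up:

```sql
-- Create an encrypted tablespace; everything stored in it is encrypted at rest
CREATE TABLESPACE secure_data
  DATAFILE '/u01/oradata/orcl/secure_data01.dbf' SIZE 500M
  ENCRYPTION USING 'AES256'
  DEFAULT STORAGE (ENCRYPT);

-- Applications use it like any other tablespace; no code changes needed
CREATE TABLE hr.employees_pii (
  emp_id NUMBER,
  ssn    VARCHAR2(11))
  TABLESPACE secure_data;
```

The application never sees the encryption; it only matters to someone reading the datafiles or backups directly.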
Three, protect yourself from seeing data you don't really want to see in the first place. Being able to prove that even with sysdba permissions you can't see the data in a Database Vault realm protects you from knowing what you really don't want to know. Using Database Vault protects the data from the administrators but still allows the job to get done. Database Vault is an option that can be configured after creating the database using dbca (database configuration assistant). The realms would be managed by someone outside of the DBAs or SYSDBA to make sure that these permissions cannot be granted back to the SYSDBA.
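A realm around a sensitive schema can be sketched with the DBMS_MACADM package; the realm name and the HR schema are made up for the example, this would be run by the Database Vault owner rather than the DBA, and the exact parameter names may vary by release:

```sql
BEGIN
  -- Create a realm around the sensitive schema
  DVSYS.DBMS_MACADM.CREATE_REALM (
    realm_name    => 'HR Data Realm',
    description   => 'Protects HR tables from privileged users',
    enabled       => DVSYS.DBMS_MACUTL.G_YES,
    audit_options => DVSYS.DBMS_MACUTL.G_REALM_AUDIT_FAIL);

  -- Put all objects in the HR schema inside the realm
  DVSYS.DBMS_MACADM.ADD_OBJECT_TO_REALM (
    realm_name   => 'HR Data Realm',
    object_owner => 'HR',
    object_name  => '%',
    object_type  => '%');
END;
/
```

After this, even a sysdba query against the HR tables fails with a realm violation, which is exactly the proof described above.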
Four, check out the database firewall. We should be evaluating and looking at new security features. The firewall can help in the fight against SQL injection, and examining new features and doing the research is useful for understanding whether something would be of value and reduce some risk in your environment.
Five, educate business and users of the environment. This would also tie in the research about features. After the initial understanding of what can be secured, monitored and implemented, then it is time to talk. Discuss what it takes to implement and the risk, which is a great way to look at some of the value you can get out of securing the environment.
These are good steps to be taking as a DBA to provide a secured database environment.
It is not necessarily going to be the easier road to take, but there are some things we can do. And there is persistence on our side, because I would definitely rather err on the side of trying to implement the needed security and communicating the risks than be caught when it isn't implemented without having even tried.

Friday, February 4, 2011

Get the information fast

I was reminded today, it is good to be a DBA. A DBA can mean so many things and have many different roles. It definitely keeps my life interesting.
Data can be very helpful to the business and provide important information to make decisions, execute transactions and keep things moving. The problem for the DBA is to keep things moving. Existing systems can be monitored to check if queries are executing efficiently or if there is anything bogging the system down. One day things might be running great, and the next day only one query is barely getting through. What happened? Things might have changed; a batch job could have run long. Are backups still running? Statistics? If applications are running slow, the question comes back: what is wrong with the database?
At this point, if you are a proactive DBA, you have a question back: is this a normal process or is this something new? It seems like we are pulling more data; was new data loaded? Thank goodness for tools and the monitoring scripts that are in place. How can you tell? Can you quantify the problem? Only if you have some benchmarks can you tell whether there is more data and how much slower things might be running. Simple benchmarks on basic application queries, backup times and how long it takes to gather statistics can help show how slow or fast things are running. Space benchmarks, object counts and object changes also provide good benchmarks for the system. If you gather this information, the benchmarks are there when things change. Because I have to ask: if you make improvements and can't tell anyone how much you improved, what fun is that?
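A simple space baseline can be kept in the database itself. A minimal sketch, with a made-up benchmark table that a scheduled job would populate:

```sql
-- One-time setup: a table to hold the snapshots
CREATE TABLE perf_baseline (
  snap_time   DATE,
  owner       VARCHAR2(128),
  segment_mb  NUMBER,
  object_cnt  NUMBER);

-- Run on a schedule: record size and object counts per schema
INSERT INTO perf_baseline
  SELECT SYSDATE,
         s.owner,
         ROUND(SUM(s.bytes) / 1024 / 1024) AS segment_mb,
         COUNT(*)                          AS object_cnt
    FROM dba_segments s
   GROUP BY s.owner;
COMMIT;
```

When the "why is it slow today" question comes, comparing two snapshot dates shows immediately whether data growth is part of the answer.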
Another nice thing about proactively monitoring performance is that when there are issues, you already have quick information and can start looking at other things for the problem. Monitoring performance is something you are continuously doing, because things do change.
Systems can be designed and configured for performance. The initial build and implementation should take performance into consideration.
I am looking forward to an upcoming IOUG Training Day that addresses this topic of real world performance and architecting database systems to get the information fast. Real World Performance