
Tuesday, February 12, 2013

Encrypting MaxL Passwords Using an Environment File


We all know the importance of using an environment file in our batch scripts, right?  No?  Well, I am not going to lecture you on that right now...  We are going to talk about how you can encrypt your MaxL passwords while still using an external environment file to store credentials and server information.

There is a ton of information out there on the various Oracle blogs about how to properly encrypt your MaxL scripts, but there is not much on how to use this encryption while still utilizing an environment file.

For this example, we are going to walk through a series of files I created to demonstrate this functionality. Each file is prefixed with a step number and can be downloaded below if you would like to follow along.

Please note, in order to run the batch scripts included in this exercise, you need to have the Essbase Client installed on your local machine.  Some modifications to the batch scripts will be needed for versions 11.1.2.2+.

http://www.scottdanesi.com/software/misc/ScottDanesi-PasswordEncryption.zip

Step 1: 01-CreateKeys.bat

This is a batch script that will automatically create a unique set of encryption keys for your system.  Run this script as-is to create the file for the next step.

Example File Contents:
echo on
REM Generate a unique public/private key pair and save the output for the next step
essmsh -gk >02-Keys.txt

Step 2: 02-Keys.txt

This text file is automatically generated by the batch file in the first step.  It will contain a unique set of keys to use for encryption and decryption of your passwords.

Example File Contents:
Essbase MaxL Shell - Release 11.1.1 (ESB11.1.1.3.0B126)
Copyright (c) 2000, 2009, Oracle and/or its affiliates.
All rights reserved.


Public Key for Encryption: 2257,1114744343
Private Key for Decryption: 52844593,1114744343

MaxL Shell completed

Step 3: 03-PasswordInfo.txt

This text file needs to be modified with the Public Key from the "02-Keys.txt" file generated in Step 2, along with the password that you want to encrypt.  It is technically a delimited file that uses the pipe character (|) as the separator, with the password first and the public key second.

Once you have modified this file, continue to the next step.

Example File Contents:
Password|18187,293621927

Step 4: 04-EncryptPassword.bat

For this step, you will run this batch script, which encrypts the password using the public key that you entered into the file in Step 3.  The script creates a text file containing your freshly encrypted password.

Example File Contents:
FOR /F "tokens=1-2 delims=|" %%A IN (03-PasswordInfo.txt) DO (
SET NewPassword=%%A
SET PublicKey=%%B
)

essmsh -ep %NewPassword% %PublicKey% >05-EncryptedPassword.txt

Step 5: 05-EncryptedPassword.txt

This is the output file from Step 4 that contains your newly encrypted password.  You will need to reference this file when setting up the Environment File in the next step.

Example File Contents:
Essbase MaxL Shell - Release 11.1.1 (ESB11.1.1.3.0B126)
Copyright (c) 2000, 2009, Oracle and/or its affiliates.
All rights reserved.

Encrypted Data: 031017652655928821892075652309090891

MaxL Shell completed

Step 6: 06-EnvFile.txt

So here we are...  We now have everything that we need to populate the environment file.  This environment file is going to hold 4 pieces of critical connection information for our test script.  For this example, the settings must be in the following order: username, encrypted password, server name, and private key from the 02-Keys.txt file.  Each of these values needs to be separated by the pipe (|) delimiter.  Take a look below for my example.

Example File Contents:
admin|031017652655928821892075652309090891|localhost|205053283,293621927


Step 7: 07-Test.bat

This is our main batch script for testing that our environment file is working properly.  It first reads from the environment file that we updated in Step 6 and sets the variables accordingly.  It then runs a MaxL script that I included, 08-Test.mxl, which logs into Essbase and logs back out.  The batch also creates the "09-TestMaxLOutput.txt" file so that you can verify that everything worked.

Example File Contents:
FOR /F "tokens=1-4 delims=|" %%A IN (06-EnvFile.txt) DO (
SET username=%%A
SET password=%%B
SET servername=%%C
SET key=%%D
)

echo on

essmsh -D 08-Test.mxl %key% %username% %password% %servername%
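The actual 08-Test.mxl is in the download, but to give you an idea of what a script like it could look like, here is a minimal sketch.  This is my assumption about the argument mapping, not necessarily the exact file from the zip: it assumes the private key passed with -D is used by the MaxL Shell for decryption and that the remaining batch arguments come through as the positional variables $1 (username), $2 (encrypted password), and $3 (server name).  The $key prefix tells the shell that the value following it needs to be decrypted with the private key.

Sketch (assumed contents):
/* Sketch only - log in using the decrypted password, then log out */
login $1 $key $2 on $3;
logout;
exit;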

That is pretty much it!  So don't forget to download the zip file that contains all of this sample code and start experimenting for yourself.  Feel free to post any questions in the comments below and I will answer.

Thanks!

Wednesday, December 5, 2012

Innovative Way to Structure Essbase/Planning Account Dimension


Hey Everyone,
I wanted to share with you today a pretty innovative and different way to structure an account dimension in Essbase or Planning.  This method is not new and has been passed down from generation to generation, from ancient times when Planning could be installed from 3.5" floppy disks, slowly evolving into what I am going to describe below.  This will not be ideal in all situations, but when it fits, it fits very well.

I am going to cut to the chase and just get right into it.  Below you will find a sample structure of this method for use in a Planning application, along with some notes about what I personally like and dislike about this structure.


  • Account
    • Transferred_Accounts (~)
      • Loaded_Sales (Stored) (~)
      • Loaded_COGS (Stored) (~)
    • Input_Accounts (~)
      • Sales_Adjustment (Stored) (~)
      • COGS_Adjustment (Stored) (~)
    • Calculated_Accounts (~)
      • Calculated_Sales (Stored) (~)
      • Calculated_COGS (Stored) (~)
    • Reporting_Accounts (~)
      • Total_Earnings (Dynamic Calc) (+)
        • Sales (Dynamic Calc) (+)
          • Loaded_Sales (Shared) (+)
          • Total_Adjusted_Sales (Dynamic Calc) (+)
            • Calculated_Sales (Shared) (+)
            • Sales_Adjustment (Shared) (+)
        • COGS (Dynamic Calc) (-)
          • Loaded_COGS (Shared) (+)
          • Total_Adjusted_COGS (Dynamic Calc) (+)
            • Calculated_COGS (Shared) (+)
            • COGS_Adjustment (Shared) (+)

So obviously the example above is pretty simple, but what I am trying to show here is how you can bucketize most of the accounts into flat lists and use them as shared members in the Reporting hierarchy.  Keeping the Transferred, Input, and Calculated accounts in flat lists makes processing and locating accounts much easier.

Something to note is that all members under the Reporting_Accounts parent are either Dynamic Calc or Shared.

I know this is going to get asked, so I will address it right now...  "But Scott, you are telling me that in the reporting account structure my Sales = Loaded_Sales + Calculated_Sales + Sales_Adjustment?  That is not right..."  The answer is like Schrödinger's cat: yes and no.  Yes, the outline physically aggregates these members.  No, because loaded data should be in the Actuals scenario and calculated/adjustment data should be in the Plan scenario, so within any single scenario the rollup only picks up the members that actually hold data and nothing gets double counted.

Below is my list of pros and cons for this method.

Pros
  • Simplifies calc scripts.  For example, if you had a calc script that cleared all of the input accounts for a specific period, you could use @CHILDREN("Input_Accounts") within the FIX statement (see the sketch after these lists).  This avoids having to update the calc script whenever a new input account is added to the structure.
  • This works for data exports and transfers as well.  
  • This method makes locating accounts much easier as you can sort the flat lists of accounts by name in Essbase.
  • This method also simplifies security.  For example, you can apply security easily to all Input accounts in one place without having to pick through a large reporting hierarchy.  --Thanks MG!
Cons
  • This adds complexity for end-users since they need to be trained to only pull from the Reporting hierarchy at the bottom of the dimension.
  • Since this is a non-conventional way to structure your accounts, some administrators might think you are crazy.  There is a very small gap between crazy and innovative.
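To make the calc script point from the pros list a bit more concrete, here is a rough sketch of the kind of clear script I am talking about.  The Scenario, Version, and Period member names ("Plan", "Working", "Feb") are made up for illustration, and the sketch assumes a typical block storage outline where CLEARBLOCK inside a FIX only clears the cells in the fixed range:

/* Sketch only - clears all input accounts for one period without naming them individually */
FIX ("Plan", "Working", "Feb", @CHILDREN("Input_Accounts"))
    CLEARBLOCK ALL;
ENDFIX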

I would love to hear some feedback on this approach and see what some of you other experts think of it.  Also, as always, let me know if you find any errors in what I am posting.  Thanks for reading!