# IT Blog

### Just some IT stuff

So you want to create a batch class? Great. You probably already know that batch Apex can process many more records than a common Apex script, as it is not subject to the normal governor limits: a Database.QueryLocator can return up to 50 million records.

Here is a general template for a batch class:

```apex
global class AwesomeBatch implements Database.Batchable<sObject> {

    // Placeholder query - substitute a real object, field and value
    String query = 'SELECT Id, SomeField__c FROM SomeObject__c WHERE SomeField__c = \'someValue\'';

    global Database.QueryLocator start(Database.BatchableContext BC) {
        return Database.getQueryLocator(query);
    }

    global void execute(Database.BatchableContext BC, List<sObject> scope) {
        for (sObject s : scope) {
            s.put('SomeField__c', 'someNewValue');
        }
        update scope;
    }

    global void finish(Database.BatchableContext BC) {
        // send email with batch results?
    }
}
```

Just remember to include the start, execute and finish methods.
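The email idea in the finish method above can be sketched like this (the recipient address and the message wording are just placeholders; the AsyncApexJob query and Messaging calls are standard Apex):

```apex
global void finish(Database.BatchableContext BC) {
    // Fetch the final statistics of this batch job
    AsyncApexJob job = [SELECT Status, NumberOfErrors, JobItemsProcessed, TotalJobItems
                        FROM AsyncApexJob WHERE Id = :BC.getJobId()];

    Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
    mail.setToAddresses(new String[] { 'admin@example.com' }); // placeholder address
    mail.setSubject('AwesomeBatch finished: ' + job.Status);
    mail.setPlainTextBody(job.JobItemsProcessed + ' of ' + job.TotalJobItems +
        ' batches processed, ' + job.NumberOfErrors + ' errors.');
    Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
}
```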
And here is how to run the batch from an Apex script:

```apex
AwesomeBatch bc = new AwesomeBatch();
Id bcId = Database.executeBatch(bc, 50);
System.debug(bcId);
```

This will put the batch into the run queue and print out the job Id, which is useful for checking the batch status using SOQL:

```sql
SELECT Id, Status, JobItemsProcessed, TotalJobItems, NumberOfErrors
FROM AsyncApexJob WHERE Id = 'bcId'
```

(replace 'bcId' with the actual job Id printed above)
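If you are polling from Apex instead, the job Id can be bound directly (a small sketch reusing the bcId variable from above):

```apex
AsyncApexJob job = [SELECT Id, Status, JobItemsProcessed, TotalJobItems, NumberOfErrors
                    FROM AsyncApexJob WHERE Id = :bcId];
System.debug(job.Status + ': ' + job.JobItemsProcessed + '/' + job.TotalJobItems);
```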

Some things to remember:
• you don't run a batch, you add it to the queue; there is no guarantee when it will actually run
• the scope parameter (50 in this case) defines how many records from the set returned by the QueryLocator will be processed by each execution, so if the query returns 187 rows and the scope is set to 50, there will be 4 runs: 50, 50, 50, 37
• it is common practice to schedule batches using a class implementing the Schedulable interface
• along with Database.Batchable, you can implement the Database.Stateful interface, which allows you to keep property values between execute method runs
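The scheduling practice from the list above can be sketched like this (the class name and cron expression are just examples):

```apex
global class AwesomeBatchScheduler implements Schedulable {
    global void execute(SchedulableContext sc) {
        // Kick off the batch with a scope of 50 records per execution
        Database.executeBatch(new AwesomeBatch(), 50);
    }
}
```

Then schedule it, e.g. every day at 2 AM (the Apex cron format is: seconds minutes hours day-of-month month day-of-week):

```apex
System.schedule('Nightly AwesomeBatch', '0 0 2 * * ?', new AwesomeBatchScheduler());
```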

Person Account is a special type of account (much like a separate record type). It was added later on by Salesforce, when they realized that customers keep records not only of other businesses in SF, but of regular people as well.

There are some considerations to be aware of, and enabling Person Accounts in your organization requires contacting Salesforce support.

How to select only Person Accounts? Simple:

```sql
SELECT Id, Name
FROM Account
WHERE RecordType.IsPersonType = true
```

Hi devs!

Here is how to create a DB transaction and roll it back if an exception is thrown:

```apex
Savepoint sp = Database.setSavepoint();
try {
    insert account1;
    update customObject;
} catch (Exception e) {
    Database.rollback(sp);
}
```

The benefit is that if the insert of account1 is successful but the update (e.g. with the new account Id) fails, the whole transaction is rolled back, so there is no trace of account1 in the database.

Ok, so let's do something productive with the HP Operations Orchestration.
I'm assuming you have installed at least the OO Studio and you are able to run it.
Our first flow will not do anything spectacular like running a Disaster Recovery Procedure - it will only copy a file on the local machine - but if you grasp the concept, there will be almost no limits to what you can achieve (sounds Hollywood-y, but hey - why not :) ).

So, let's get started.

1. Open your OO Studio (obviously!)
2. Create a new project using this big green plus sign:
3. Type in a project name; I'm not that creative, so I will use "Copy File". It's worth noting where the project is created, as this is also where Studio will create packages to deploy to Central.
4. OO creates a basic folder layout for you; you will see that there is a Library folder and a Configuration folder. Let's leave Configuration for now and create a subfolder in Library. Note that you cannot create a flow directly in Library; there needs to be at least one subfolder. I will name my subfolder "CopyFile" (again with this creativity!).
5. Now the magic begins :) Let's create our first flow: right-click the CopyFile subfolder and choose New -> Flow. You can name it as you wish; for the sake of not being boring, I will use "CopyFile".

6. Now you will notice that the newly created flow is underlined in red. This is because it's empty (duh!) and thus does not contain the necessary flow parts, like a starting step and results. Let's deal with that.
7. We want to copy a file, so let's look for a built-in step to do so. All built-in steps are listed in the Dependencies section just below your project folder structure. You can use search to find what you are looking for, or just browse manually.
8. Driven by some experience, I will go to Operations -> File System -> Windows Only and look for FS Copy step.
9. Let's drag the FS Copy step onto the canvas. It is quite important to note some elements of this step, as they illustrate basic characteristics and rules true of any flow you will create.
• the step has a thick green frame - this means it is the first step (starting point) of the flow, so we have satisfied one of the generic flow requirements
• go ahead and select the step, then go to the Inspector section below - the step has Inputs, Results and a Description (and some other things I will leave out for now). I urge you to read the Description part, as it is a mini-documentation for this exact step; almost every step has one. Also note which inputs are required and which are not.
10. So we have a flow with one step. It is still not a valid one, because OO does not know what to do if the step succeeds or fails. Our flow is as simple as it gets, so it will end either way, but I'd like to clearly see it in red if it fails and in green if it went OK. To do so, let's go to the Step Palette at the top of the canvas and drag in the Resolved (green tick) and Error (red cross) result steps.

Note: The placement of these is quite arbitrary, but it is considered good practice to keep transitions between steps non-intersecting.
11. Did you notice that the FS Copy step has similar icons on it? That is because we will connect them accordingly. It's as easy as that. Just drag success to Resolved: success and failure to Error: failure. The arrows that connect the steps are called transitions. At the end, you should have something like this:
12. Now let's save the flow (Ctrl-S, as you would expect). The red underline in your Library should disappear, as this is a fully functional, valid OO flow! Congrats! Let's test it.
13. In OO Studio, to run a flow you need to click the "debug" icon. Studio generally only debugs flows; in production they should be run from Central.
14. In debug mode, you will see the big green Play button - let's use it to run the flow. A form will pop up, letting you populate the input fields. Note that required fields are marked with an asterisk (*). I will ignore all non-mandatory fields for now.
15. I have created a text file, C:\Tmp\text.txt, that we will be copying - go ahead and create one for yourself. Here is what I will type into the form:
• Source FileName: C:\Tmp\text.txt
• Destination: C:\Tmp\text2.txt
16. After a few seconds the flow should complete with the step Resolved: success:

17. You can verify that the C:\Tmp\text2.txt file was created. And for some reason it is identical to text.txt!

That's it! You are now officially an HP Operations Orchestration 10.10 Developer!

HP Operations Orchestration 10 is pretty complex and versatile software. This means that at first glance it may not be obvious how it works and what parts are involved in the suite.

With the version change from 9.x to 10.x there were some major changes in how HP OO is built and how flows are published, but let's concentrate on the 10.x version.

So here is a pretty general description you may find useful.

The Central

First of all, there is HP OO Central, installed on a Windows or Linux box. This is the main part, the engine that handles tasks and configuration. It's essentially a Java web app with a database backend (you can use MySQL, PostgreSQL, MS SQL Server or Oracle, depending on your preference and the size of the deployment). HP OO Central handles some basic features:

• Access control (authentication, roles, etc)
• Scheduler
• Flow deployment
• Running flows
• Logs of previous runs
• Pretty nice WebUI for the whole system

You can obviously install HP OO Central separately from its DB. Central can also be installed in a High Availability setup, which is quite simple - just install another Central and point it to the same DB. All the magic behind it is done automatically.

The Workers

If you feel that there is more to be done than Central can handle, or you need to run flows in remote locations but don't want or need to install more Centrals, you can install a stand-alone running engine called a Worker. Basically, it's a fairly small service/daemon that pulls jobs from Central and runs them itself. Worker and Central communicate over an encrypted SSL connection.

I think it's worth mentioning that in previous versions (9.x and below), the Worker was called a RAS - Remote Action Server. This may be a little confusing, because the v10.x docs sometimes still call it a RAS. The RAS worked a little differently - Central pushed jobs to the RAS, whereas now it's reversed. This is quite an improvement for network/security people and the whole firewall business.

The Studio

HP OO Studio is where developers put flows together. It's a graphical tool (Java-based) that can be installed only on Windows, and 64-bit only since version 10.10.

Basic life cycle for OO flows is: create a flow in Studio, debug it, pack it up in a package and deploy the package on Central.

HP Operations Orchestration is one of the world's leading IT automation suites. After about a year of developing IT workflows with it, I would say it's probably the most complete one.

HP, probably learning from the Microsoft and Oracle Express editions, recently started to share some of their top software for free usage under the Community Edition brand. This includes HP CloudOS, HP LoadRunner and HP Operations Orchestration.

Of course, HP OO Community Edition is subject to some restrictions, but it is still usable and a very solid choice for IT process automation.

HP OO CE is stripped of some content packs available in the full version. There are also some licensing restrictions:

• You may not use software to provide services to third parties.
• You may not distribute, resell, share or sublicense software to third parties.
• You may not copy the Software or make it available on a public or external distributed network.
• You may not allow access on an Intranet unless it is restricted to authorized users.
• You may copy the Software for archival purposes or when it is an essential step in authorized use so long as You retain any product identification, trademark, copyright or other notices in the Software.
• You may not modify, reverse engineer, disassemble, decrypt, decompile or make derivative works of the Software. If you have a right to do so under law, you must first inform HP in writing about such modifications.
• You may not disclose to any third party performance information or analysis (including, without limitation, benchmarks and performance tests) from any source relating to the Software.
• You may not use the Software other than in compliance with the authorizations and restrictions for the specific Software found at www.hp.com/go/SWLicensing.

So the bottom line is: you can use it "as is" for a company's internal processes. In my opinion, it's good enough for some basic stuff.

While maintaining MS SQL Server databases used by external applications, you have probably needed at some point to execute an action that was blocked by one or more existing connections.
This is how you can kill all open connections to a specific DB:
```sql
USE master;
GO
ALTER DATABASE dbName
SET SINGLE_USER
WITH ROLLBACK IMMEDIATE;
GO
ALTER DATABASE dbName
SET MULTI_USER;
GO
```
It's probably not the prettiest method, but it gets things done, and fast. What it does is switch the DB to single-user mode (rolling back and dropping all existing connections immediately), then back to multi-user mode.
"dbName" is obviously the name of the database that you need freed.

This is how you can check the size of all tables in an MS SQL Server database:

```sql
DECLARE @cmd VARCHAR(max)

-- Cursor over all tables in the current database
DECLARE curs CURSOR FOR
    SELECT table_catalog + '.' + table_schema + '.' + table_name
    FROM information_schema.tables

OPEN curs
FETCH NEXT FROM curs INTO @cmd
WHILE @@fetch_status = 0
BEGIN
    EXECUTE sp_spaceused @cmd
    FETCH NEXT FROM curs INTO @cmd
END
CLOSE curs
DEALLOCATE curs
```

The way it works is:

1. create a cursor based on the list of all tables
2. for each entry in the list, execute the sp_spaceused stored procedure against that table
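If you prefer a single set-based query instead of a cursor, the system catalog views can produce similar numbers (a sketch; the * 8 assumes the standard 8 KB page size):

```sql
SELECT s.name AS schema_name,
       t.name AS table_name,
       SUM(a.total_pages) * 8 AS total_kb   -- pages are 8 KB each
FROM sys.tables t
JOIN sys.schemas s          ON s.schema_id = t.schema_id
JOIN sys.partitions p       ON p.object_id = t.object_id
JOIN sys.allocation_units a ON a.container_id = p.partition_id
GROUP BY s.name, t.name
ORDER BY total_kb DESC;
```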