User General user info job DIRAC


Installation and configuration of DIRAC Grid tools

If the DIRAC Grid tools are not installed on the machine to which you are logged in, run this set of commands :

$ mkdir dirac_ui
$ cd dirac_ui
$ wget -np -O dirac-install https://raw.githubusercontent.com/DIRACGrid/DIRAC/integration/Core/scripts/dirac-install.py
$ chmod u+x dirac-install
$ ./dirac-install -r v6r19p10 -i 27 -g v13r0 -e COMDIRAC

Now, to configure the DIRAC tools, issue these commands :

$ source bashrc
$ dirac-proxy-init -x
$ dirac-configure -F -S EGI-Production -C dips://dirac-config.egi.eu:9135/Configuration/Server -I

After entering the last command, you should see that the file etc/dirac.cfg has been updated to contain the correct information.
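To check the result, you can display the generated configuration file and, if the command is available in your installation, print the client version (a quick sanity check; the exact content of dirac.cfg depends on your setup) :

$ cat etc/dirac.cfg
$ dirac-version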


Managing your credentials

These steps assume that you have a valid personal grid certificate imported in your browser. If that is not the case, or if the certificate has expired (which happens after one year), then please follow these instructions.

If the directory ~/.globus already exists and contains valid userkey.pem and usercert.pem files, then you can skip this step.

Downloading certificate from your browser

To get the certificate from the browser:

  • Firefox:
       Preferences -> Advanced -> View Certificates -> Select your certificate -> Backup
  • Internet Explorer:
       Tools -> Internet Options ->Content -> Certificates -> Certificates ->Import/Export

Whichever browser you use, during the export process you will be asked to provide a password to protect the .p12 content (it contains the private key of your personal certificate!).

As a result you will get the certificate as a file with .p12 extension.

Converting Certificates from P12 to PEM format

Source the dirac environment :

$ cd dirac_ui
$ source bashrc

Assuming the .p12 file you obtained in the previous step is in the current directory, issue the following command :

dirac-cert-convert.sh <USERCERT>.p12

where <USERCERT> is to be replaced by the name you've given to the .p12 file during the previous step.

During the execution of this command, you'll be asked twice to "Enter Import Password" : this is the password that protects your .p12 file (see previous step). You will also be asked to provide a passphrase to protect the private key (this is the "PEM passphrase"). This PEM passphrase will be requested each time you create a proxy : don't forget it !
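You can verify the converted files with standard OpenSSL commands (this only reads the files; the last command, which assumes an RSA key, will prompt for the PEM passphrase you just chose) :

$ ls -l ~/.globus
$ openssl x509 -in ~/.globus/usercert.pem -noout -subject -dates
$ openssl rsa -in ~/.globus/userkey.pem -noout -check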

Job management

In this section, we will illustrate job management with the DIRAC Grid commands using a very simple job.

First set up your environment so that dirac commands are available :

$ cd ~/dirac_ui
$ source bashrc

When you start a new session, always check that you still have a valid proxy :

$ dirac-proxy-info

subject          : /DC=begrid/C=BE/O=VUB/CN=JOHN DOE jdoe@vub.be/CN=proxy
issuer           : /DC=begrid/C=BE/O=VUB/CN=JOHN DOE jdoe@vub.be
identity         : /DC=begrid/C=BE/O=VUB/CN=JOHN DOE jdoe@vub.be
timeleft         : 23:29:17
DIRAC group      : beapps_user
path             : /tmp/x509up_u20533
username         : jdoe
properties       : NormalUser

If the timeleft shows only a few hours remaining, then it is time to renew the proxy :

$ dirac-proxy-init -M -g beapps_user
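If you script your Grid work, you can combine these two commands to renew the proxy only when needed; a minimal sketch that parses the "timeleft" line in the format shown above (the 6-hour threshold is arbitrary) :

#!/bin/bash
# Renew the beapps_user proxy when fewer than 6 hours remain (or when no proxy exists).
HOURS=$(dirac-proxy-info 2>/dev/null | awk '/timeleft/ {split($3,t,":"); print t[1]}')
if [ -z "$HOURS" ] || [ "$HOURS" -lt 6 ]; then
    dirac-proxy-init -M -g beapps_user
fi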

Grid jobs are described using JDL (Job Description Language; see this link for the JDL reference). Let's start with a very simple job by creating a file "simplejob.jdl" with the following content :

JobName             = "simple_job";
Executable          = "/bin/ls";
Arguments           = "-ltr";
StdOutput           = "stdout";
StdError            = "stderr";
OutputSandbox       = {"stdout","stderr"};
Site                = "EGI.ULB.be";

In plain terms, this JDL means that we will run the command "/bin/ls -ltr", that the standard output of the command will be redirected to a file "stdout", and the standard error messages to a file "stderr". Since these two files are listed in the OutputSandbox, it will be possible to retrieve them when the job is finished (we will see how to do the retrieval a bit later). The last line with the "Site" attribute has been added to make sure that the job is sent to the BEgrid site (known as "EGI.ULB.be" in the DIRAC4EGI system); otherwise the job could be sent to any site supporting the beapps VO.
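As a point of reference, the JDL above corresponds roughly to running the following command locally, with the two output files then shipped back to you in the output sandbox :

$ /bin/ls -ltr > stdout 2> stderr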

Let's submit this job :

$ dirac-wms-job-submit simplejob.jdl
JobID = 28959669

The output of the last command gives you the JobID, a sequence of digits that you will need to keep in order to manage your job afterwards.
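If you submit jobs from a script, you can capture the JobID directly; a minimal sketch, assuming the "JobID = <number>" output format shown above :

$ JOBID=$(dirac-wms-job-submit simplejob.jdl | awk '/JobID/ {print $NF}')
$ echo $JOBID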

Now you can check the status of the job from time to time :

$ dirac-wms-job-status 28959669
JobID=28959669 Status=Waiting; MinorStatus=Pilot Agent Submission; Site=EGI.ULB.be;

Once the job has reached a terminal status (Done or Failed), you can retrieve the files mentioned in the OutputSandbox attribute :

$ dirac-wms-job-get-output 28959669
Job output sandbox retrieved in /user/stgerard/dirac-job-9/28959669/
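If you prefer not to poll by hand, a small wrapper can wait for a terminal status and then fetch the sandbox; a minimal sketch reusing only the commands shown above, and assuming the terminal states are reported as Status=Done or Status=Failed :

#!/bin/bash
# Wait until the job reaches a terminal status, then retrieve its output sandbox.
JOBID=28959669
while true; do
    STATUS=$(dirac-wms-job-status $JOBID)
    echo "$STATUS"
    case "$STATUS" in
        *Status=Done*|*Status=Failed*) break ;;
    esac
    sleep 60
done
dirac-wms-job-get-output $JOBID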

Job interacting with a Storage Element

In the world of Grid computing, a Storage Element (or SE) is a storage system that is equipped with a software interface (middleware) that allows authenticated and authorized grid users to interact with it. BEgrid has its own SE that is called "ULB-disk" in the DIRAC4EGI system.

A job can interact with an SE in two basic ways :

  • stage-in  : when starting, the job downloads from the SE the files (generally data) that are required for its execution
  • stage-out : upon completion, the job writes to the SE the files that it has generated

As we will see in the next example, stage-in and stage-out are really simplified thanks to the elegant JDL syntax.

In this section, we will work through a job that downloads its input data file from the BEgrid SE, runs a computation on these data, and then writes the result back to the SE.

First, we will generate a data file containing a list of 10 random numbers :

$ (for i in {1..10}; do echo $(( ( RANDOM % 1000 )  + 1 )); done;) > rand_num_list.txt

The list has been written to a file "rand_num_list.txt" that we will have to copy to the SE with the following generic command :

dirac-dms-add-file <LFN> <local_file> <SE>

The argument <LFN> will be the logical path of the file in the DIRAC File Catalog (DFC). Each user has their own home directory in the DFC. To get the path of your own home directory, issue the following command :

$ dls
/beapps/user/s/stgerard:
...

So, a valid LFN path for the data file could be :

/beapps/user/s/stgerard/rand_num_list.txt

Let's upload the file to the SE :

$ dirac-dms-add-file /beapps/user/s/stgerard/rand_num_list.txt rand_num_list.txt ULB-disk
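To verify that the upload succeeded, list your DFC home directory again; the new file should appear in the output (dirac-dms-lfn-replicas, if present in your installation, also shows which SE holds a replica) :

$ dls -l
$ dirac-dms-lfn-replicas /beapps/user/s/stgerard/rand_num_list.txt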

Now, we will write a script "sum.sh" that will add the numbers of the file provided as argument :

#!/bin/bash
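# Print the sum of the numbers (one per line) in the file given as the first argument.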

awk '{s+=$1} END {print s}' $1

Make sure that this script is working :

$ chmod u+x sum.sh
$ ./sum.sh rand_num_list.txt

To submit this script, we need the following JDL (call it "sum.jdl") :

JobName       = "SumRandomNumbers";
Executable    = "sum.sh";
Arguments     = "rand_num_list.txt";
StdOutput     = "StdOut";
StdError      = "StdErr";
InputSandbox  = {"sum.sh","LFN:/beapps/user/s/stgerard/rand_num_list.txt"};
OutputSandbox = {"StdOut","StdErr"};
OutputSE      = "ULB-disk";
OutputData    = {"StdOut"};
Site          = "EGI.ULB.be";

In this JDL, the InputSandbox attribute is a list of 2 files : the first one is the Bash shell script that will be the engine of our job; the second one is the file containing the data that will be processed by the script, specified by its LFN, so that the Grid machinery knows it has to stage in this file into the job directory prior to the execution of the script. The OutputData attribute specifies that the file StdOut will be written to the SE given by OutputSE.

It's now time to submit the job :

$ dirac-wms-job-submit sum.jdl
JobID = 28959670

and to check its status, repeat the following command :

$ dirac-wms-job-status 28959670
JobID=28959670 Status=Waiting; MinorStatus=Pilot Agent Submission; Site=EGI.ULB.be;

until "DONE" status has been reached, in which case, you can retrieve the OutputSandbox :

$ dirac-wms-job-get-output 28959670
Job output sandbox retrieved in /user/stgerard/dirac-job-9/28959670/

To check that the file "StdOut" has been written to the SE, run the following command :

$ dls -l
/beapps/user/s/stgerard:
29364
...

In the output, you should see a new directory whose name begins with the first 5 digits of the job id. Inside this directory, you'll find a directory named after the job id, containing the file StdOut :

$ dls -l 29364/28959670/
/beapps/user/s/stgerard/29364/28959670:
-rwxrwxr-x 1 stgerard beapps_user 5 2018-07-20 11:53:02 StdOut
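To download the result locally, you can fetch it by its LFN; a minimal sketch, assuming the standard dirac-dms-get-file command and the example paths above :

$ dirac-dms-get-file /beapps/user/s/stgerard/29364/28959670/StdOut
$ cat StdOut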

Data management with DIRAC Grid tools

TBD

Tutorials, training materials

In the attachments part of this page, you can download a self-contained tutorial in PDF format.

Sources

https://www.gridpp.ac.uk/wiki/Quick_Guide_to_Dirac

http://dirac.readthedocs.io/en/integration/UserGuide/Tutorials/ClientInstallation/index.html

https://github.com/DIRACGrid/DIRAC/wiki

