Manage your Hadoop clusters with your EC2 scripts on Cloud Ikoula One


When migrating to another cloud, it is often necessary to port your EC2 cluster administration scripts (in particular those that manage Hadoop clusters).

CloudIkoulaONE is compatible with your existing EC2 scripts, with only minor adjustments.

We assume that your scripts are written in Python 2.7 and use the boto3 library.


You first need to install boto3:

pip install boto3
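
You can optionally check that the library is importable; the command below simply prints the installed boto3 version:

python -c "import boto3; print(boto3.__version__)"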


Once this is done, go to your home directory and create a .aws directory:

cd ~

mkdir .aws


Inside this directory, create two files:

cd .aws

touch config

touch credentials


Your config file should contain this:

[default]

region = EU-FR-IKDC2-Z5-BASIC

endpoint_url = https://ec2.ikoula.com

ec2 =
    signature_version = v2


Choose the region (also called a zone in CloudIkoulaONE) on which your cluster is deployed.
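
If you prefer not to hard-code the region in the config file, boto3 also accepts it directly when creating the client. The sketch below simply reuses the zone name and endpoint from this example; adapt them to your own deployment:

import boto3

# Minimal sketch: pass the zone/region and endpoint explicitly instead of relying on ~/.aws/config
conn = boto3.client('ec2',
                    region_name='EU-FR-IKDC2-Z5-BASIC',
                    endpoint_url='https://ec2.ikoula.com')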

The credentials file contains your API access key and secret key, which you previously retrieved from the CloudIkoulaONE interface.

[default]

aws_access_key_id = APIKEY CLOUD IKOULA ONE

aws_secret_access_key = SECRETKEY CLOUD IKOULA ONE


Now that the boto3 environment is configured to use the EC2-compatible endpoint of CloudIkoulaONE, you can run your scripts.

However, you may need to slightly adjust the connection "string":

endpoint = "https://ec2.ikoula.com"
conn = boto3.client('ec2', endpoint_url=endpoint)


The boto3 library will read the files in ~/.aws and use them to connect to CloudIkoulaONE.
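
As a quick sanity check, and assuming the CloudIkoulaONE EC2-compatible endpoint supports the DescribeInstances call, you can list the instances visible to your account. This is a minimal sketch reusing the connection created above:

# List every instance ID visible with the configured keys, together with its state
for reservation in conn.describe_instances()['Reservations']:
    for vm in reservation['Instances']:
        print vm['InstanceId'] + " : " + vm['State']['Name']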


Afterwards, booting the cluster can be done as follows (provided you have a list containing the GUIDs of the instances in your big data cluster):

for instance in instances_list:
    print "Starting instance : " + instance
    conn.start_instances(InstanceIds=[instance])
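
Stopping the cluster works the same way; the loop below is simply the mirror of the start loop, using the stop_instances call on the same instances_list:

# Stop every instance of the cluster, one by one
for instance in instances_list:
    print "Stopping instance : " + instance
    conn.stop_instances(InstanceIds=[instance])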