Wednesday, December 19, 2012

Android Project Ant Build, and putting the version in the release file name

When you create a project with the Eclipse Android tooling, by default you can only build the project and export the APK through the Eclipse menus.

Ant is another powerful build tool. You can convert the Android project to an Ant build target and then just run ant release to build the APK. Here is a demo.

Given a project called FrameworkDemo:
[screenshot: the FrameworkDemo project in Eclipse]

Run android update project -p . and it will create an Ant build.xml file:
[screenshot: the generated build.xml in the project]
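For reference, the conversion step is just the command below (run from the project root; it assumes the SDK android tool is on your PATH, and it should also write local.properties with the SDK location):

# run in the root of the FrameworkDemo project
android update project -p .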

Run ant release and it will create an APK under the bin folder:
[screenshot: the release APK under bin/]
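For reference, without any signing configuration the stock rules should name the output bin/FrameworkDemo-release-unsigned.apk; once the signing keys below are configured, the same command produces a signed bin/FrameworkDemo-release.apk:

# build the release APK into bin/
ant release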

You can put the signing key information into ant.properties, like:
[screenshot: ant.properties with the signing key entries]
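A sketch of what those ant.properties entries can look like (the paths and passwords are placeholders; key.store/key.alias and their *.password properties are what the SDK Ant rules read for release signing):

# ant.properties - release signing settings picked up by ant release
key.store=/path/to/release.keystore
key.store.password=myStorePassword
key.alias=myKeyAlias
key.alias.password=myAliasPassword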

However, if you want the release file to be called projectname-yourversion-release.apk (you can get the version from AndroidManifest.xml), here is a quick fix: change the build.xml under platform/tools/ant.

Add a target to extract the version information:
[screenshot: the version-extraction target in build.xml]
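Something along these lines, as a sketch: it assumes the <xpath> task that the stock SDK build.xml already uses for reading AndroidManifest.xml, and it stores the result in a property called manifest.version.name (the target name is illustrative):

<!-- read android:versionName from the project manifest into a property -->
<target name="-get-version-name">
    <xpath input="AndroidManifest.xml"
           expression="/manifest/@android:versionName"
           output="manifest.version.name" />
</target>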
Then reference this version:
[screenshot: the output file name referencing the version property]
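For example, the final APK name can be built from that property. This sketch assumes the stock SDK rules, where out.final.file/out.absolute.dir control where the signed release APK ends up, and that the release target now depends on the version target above:

<!-- name the release APK projectname-version-release.apk -->
<property name="out.final.file"
          location="${out.absolute.dir}/${ant.project.name}-${manifest.version.name}-release.apk" />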
Now build again:
[screenshot: the build output, with the version in the APK file name]
You can see the version in the file name.

Friday, December 14, 2012

Insert via batch_mutate using Aquiles, Delete, insert again - result is empty

Problem:

Using the C# Cassandra client Aquiles, insert some data into Cassandra, then delete it using the Cassandra CLI. After that, insert the same data back again, and the result is empty.

C# code:

using Apache.Cassandra;
using Aquiles.Cassandra10;
using Aquiles.Core.Cluster;
using Aquiles.Helpers;
using Aquiles.Helpers.Encoders;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using CassandraClient = Apache.Cassandra.Cassandra.Client;

namespace CSharpClient
{
    class Program
    {
        static void Main(string[] args)
        {
            new Program().PopulateDataWithBatchMutate();
            Console.WriteLine("Finished");

        }

        private const string CLUSTERNAME = "Test Cluster";
        private const string KEYSPACENAME = "exampleKeyspace";
        private const string COLUMNFAMILYNAME = "exampleCF";

        private void PopulateDataWithBatchMutate()
        {
            Dictionary<byte[], Dictionary<string, List<Mutation>>> mutation_map = new Dictionary<byte[], Dictionary<string, List<Mutation>>>();
            for (long i = 0; i < 1; i++)
            {
                byte[] key = ByteEncoderHelper.LongEncoder.ToByteArray(i);
                Dictionary<string, List<Mutation>> cfMutation = new Dictionary<string, List<Mutation>>();
                List<Mutation> mutationList = new List<Mutation>();
                for (long j = 0; j < 2; j++)
                {
                    string columnName = String.Format("Data-{0:0000000000}", j);
                    Mutation mutation = new Mutation()
                    {
                        Column_or_supercolumn = new ColumnOrSuperColumn()
                        {
                            Column = new Column()
                            {
                                Name = ByteEncoderHelper.UTF8Encoder.ToByteArray(columnName),
                                Timestamp = UnixHelper.UnixTimestamp,
                                Value = ByteEncoderHelper.LongEncoder.ToByteArray(j),
                            },
                        },
                    };
                    mutationList.Add(mutation);
                }
                cfMutation.Add(COLUMNFAMILYNAME, mutationList);
                mutation_map.Add(key, cfMutation);
            }

            ICluster cluster = AquilesHelper.RetrieveCluster(CLUSTERNAME);
            cluster.Execute(new ExecutionBlock(delegate(CassandraClient client)
            {
                client.batch_mutate(mutation_map, ConsistencyLevel.ONE);
                return null;
            }), KEYSPACENAME);
        }
    }
}

First insert, the data is there:
[screenshot: the CLI listing the row after the first insert]

Then delete this row manually via the CLI; the data is deleted, no problem so far:
[screenshot: the CLI delete, after which the row is gone]

Then run the code again to do another insert of the same data. (STILL NO DATA. That's the problem.)
[screenshot: the CLI still shows no data after the second insert]

Why?
The C# code generates an incorrect timestamp, which is always older than the one the Cassandra CLI uses, so the older insertion is always ignored once there is a newer delete (the tombstone wins).

Changing the timestamp to the following code will fix this issue:
[screenshot: the timestamp fix in the C# code]
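A minimal sketch of such a fix, assuming the root cause is that UnixHelper.UnixTimestamp has lower resolution (seconds) than the microsecond timestamps the Cassandra CLI writes; the helper name is just illustrative:

// microseconds since the Unix epoch, like the Cassandra CLI uses, so a new
// insert is always newer than the earlier delete's tombstone
private static long GetMicrosecondTimestamp()
{
    DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
    return (DateTime.UtcNow - epoch).Ticks / 10; // 1 tick = 100 ns
}

// then in the mutation:
//     Timestamp = GetMicrosecondTimestamp(),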

Wednesday, December 12, 2012

How to: install Ganglia and set up Hadoop integration

Ganglia has three pieces: gmond, gmetad and gweb. If you search for ganglia using yum, you will see those three components.

[screenshot: yum search results for the three ganglia packages]

gmond is the monitoring agent; it collects the data and persists it (it needs to be deployed to every server that we monitor).

So: yum install ganglia-gmond

In /etc/gmond.conf you can see the cluster name; change it to your cluster name. I will pick the same one as the Hadoop cluster.

It also has some configuration for multicast (basically, the monitoring agent uses multicast to replicate the data to each node, so every node can answer query requests for every server in the same cluster), like a typical share-nothing cluster. Each node also listens on one TCP port.
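The relevant pieces of /etc/gmond.conf look roughly like this (a sketch; the multicast address and port shown are the stock defaults and can be changed):

cluster {
  name = "hadoop"            # your cluster name
}
udp_send_channel {
  mcast_join = 239.2.11.71   # multicast group used to share metrics between nodes
  port = 8649
}
udp_recv_channel {
  mcast_join = 239.2.11.71
  port = 8649
}
tcp_accept_channel {
  port = 8649                # each node answers queries on this TCP port
}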
That's it. After that we can start the gmond daemon.
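On a yum-based install that usually means something like the following (assuming the packaged init scripts):

service gmond start     # start the agent now
chkconfig gmond on      # start it automatically at boot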

Run gmond -t to show the current configuration.

The gmetad daemon aggregates monitoring data from the clusters and persists the data using rrdtool.


So you may need more than one node for HA. yum install ganglia-gmetad

For the gmetad configuration, just add a data_source entry pointing to one of the gmond nodes, then set options like where to store the rrdtool files; by default they go under /var/lib/ganglia/rrds.

Then gmetad will begin collecting data, and you can check that folder to see whether data is being collected.
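A sketch of the relevant lines in /etc/ganglia/gmetad.conf (the cluster name and host are illustrative; the name in quotes must match the cluster name configured in gmond.conf):

data_source "hadoop" gmond-node1:8649
# rrd_rootdir "/var/lib/ganglia/rrds"   # default location of the rrd files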


The last one is gweb; as the name implies, it's a web interface for end users to see the charts. Basically it's a standard PHP application that uses the php-gd module to generate dynamic charts.

yum install ganglia-gweb will download all the PHP files; they end up under /usr/share/ganglia by default.

Choose your favorite web server. I will use httpd and just copy the files to the HTML root, which for me is /var/www/html.
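For example (paths assume the defaults mentioned above):

# copy the PHP files installed by ganglia-gweb into the httpd document root
cp -r /usr/share/ganglia /var/www/html/ganglia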

You need to fix the conf.php mapping to point to /etc/ganglia/conf.php:
[screenshot: the conf.php mapping]

You need to tell the web UI where the data is located (the data gmetad aggregated), and that's it.
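A sketch of what that setting can look like (the variable names follow the older ganglia-web conf.php; the path must match gmetad's rrd_rootdir):

<?php
# where gmetad stores the aggregated rrd files
$gmetad_root = "/var/lib/ganglia";
$rrds = "$gmetad_root/rrds";
?>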

Then you can open http://webserver/ganglia to see the charts.

For Hadoop/HBase, just change hadoop-metrics.properties under the Hadoop conf folder.

Point the class to GangliaContext, and remember to set the servers to the multicast IP instead of localhost; otherwise no data will be collected if you use the default gmond multicast mode.

[screenshot: hadoop-metrics.properties pointing at the Ganglia multicast address]
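A sketch of the dfs section of hadoop-metrics.properties (GangliaContext31 is for Ganglia 3.1+, GangliaContext for older releases; the multicast address and port must match your gmond udp channels):

# send HDFS metrics to Ganglia every 10 seconds
dfs.class=org.apache.hadoop.metrics.ganglia.GangliaContext31
dfs.period=10
dfs.servers=239.2.11.71:8649

# the mapred, jvm and rpc sections follow the same pattern
mapred.class=org.apache.hadoop.metrics.ganglia.GangliaContext31
mapred.period=10
mapred.servers=239.2.11.71:8649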

Then you can see it in the Ganglia console, like the Hadoop metrics:

[screenshot: Hadoop metrics charted in the Ganglia console]

 