Mount a FUSE drive over SSH with a single command, very useful for remote storage (for example an S3-backed server).

sudo sshfs -o allow_other,defer_permissions,IdentityFile=~/.ssh/id_rsa user@your_server:/ /mnt/droplet
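
The user@your_server part is a placeholder for your own login and host. Two related commands I find handy (the mount point is just the example path from above); fusermount is the Linux way to unmount, on macOS plain umount does it:

# create the mount point before mounting
sudo mkdir -p /mnt/droplet

# unmount when finished (Linux)
fusermount -u /mnt/droplet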

These are the Linux speed-test and benchmarking commands I use.

For cpu:

sysbench --test=cpu --cpu-max-prime=20000 run
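
Note that sysbench 1.0 and later dropped the --test= option; on a newer install the same benchmark is run with the test name as a positional argument:

# same CPU benchmark, sysbench 1.x syntax
sysbench cpu --cpu-max-prime=20000 run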

For network:

wget -O /dev/null http://speedtest.dal01.softlayer.com/downloads/test100.zip
wget -O /dev/null http://speedtest.sea01.softlayer.com/downloads/test100.zip
wget -O /dev/null http://speedtest.ams01.softlayer.com/downloads/test500.zip
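
If you prefer the throughput printed as a single number instead of reading it off wget's progress bar, curl can report it (reusing one of the test files above):

# prints the average download speed in bytes per second
curl -o /dev/null -s -w '%{speed_download}\n' http://speedtest.dal01.softlayer.com/downloads/test100.zip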

For disk write speed (the same download, but written to disk instead of /dev/null):

wget http://speedtest.ams01.softlayer.com/downloads/test500.zip
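
Since a download also depends on the network, I treat that number only as a rough lower bound. For a more direct write test, dd works well (file name and size here are arbitrary):

# write 1GB of zeros and flush to disk before reporting the rate
dd if=/dev/zero of=testfile bs=1M count=1024 conv=fdatasync
rm testfile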

Docker commands for an Elasticsearch cluster are here.

------------------------ START MAIN ------------------------
------------------------ REFERENCE COMMANDS ------------------------
docker run -d --name es0 -p 9200:9200 es
docker run -d --name es1 --link es0 -e UNICAST_HOSTS=es0 es
docker run -d --name es2 --link es0 -e UNICAST_HOSTS=es0 es
------------------------ REFERENCE COMMANDS ------------------------
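
To check that es1 and es2 actually joined es0 into one cluster, the cluster health API on the published port is the quickest test:

# should report number_of_nodes: 3 once the cluster has formed
curl http://localhost:9200/_cluster/health?pretty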


Yes, The Cloud at shivalink.com is ready and waiting for its first signup. Current capacity is 26 users at 5GB each, running nginx and ownCloud, and it can host up to 40 users on SSD storage with a 1Gbps outgoing link and 5Gbps of incoming unmetered bandwidth 🙂 . It also has a mail server, Apache with CGI for all kinds of web hosting, a proxy server, a torrent server and more. Enjoy!


Crazy day: I indexed a 30GB file with 53 million lines of JSON data into Elasticsearch, then tried Kibana on top of it, which was really enjoyable over a drink. The link to Kibana is shivalink.com:5601.

The link to Elasticsearch is shivalink.com:9200.

The toughest part was decompressing the 5GB bz2 file using all cores. I tried pbzip2, but it didn't work in my case (likely because pbzip2 can only decompress in parallel files that pbzip2 itself compressed). Then I found lbzip2 -d myfile.json.bz2, which was really fast and used all my cores efficiently. The file turned out to be 30GB once decompressed. The next question was how to get it into Elasticsearch; being very new to this, I found esbulk and started with that. I had inserted 45 million entries when it became too slow, and at that point I had no option other than stopping it right there.
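
For reference, the whole pipeline looked roughly like this. The index name and the worker/batch-size flags below are only examples of esbulk options as I remember them, so double-check them against esbulk -h:

# decompress on all cores (drops the .bz2 suffix, leaving myfile.json)
lbzip2 -d myfile.json.bz2

# bulk index one JSON document per line into Elasticsearch
esbulk -index mydata -w 4 -size 5000 myfile.json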

Then I came up with the idea of using tail -n with the number of remaining entries to pull out the rest of the file, and inserted those, which worked. Now I can say I kind of know big, big data….. 🙂 feeling happy
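
Concretely, with 53 million lines in total and 45 million already indexed (assuming they went in, in file order), that leaves roughly 8 million lines to re-feed; again the esbulk flags are just example values:

# grab the last ~8 million lines that were not indexed yet
tail -n 8000000 myfile.json > rest.json

# index only the remainder
esbulk -index mydata -w 4 -size 5000 rest.json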
