Device credential for Cisco license lookup and transfer

License lookup: https://tools.cisco.com/SWIFT/LicensingUI/LicenseAdminServlet/licenseLookup. In the CLI:

```
license save credential flash0:/credentials.lic
more flash0:/credentials.lic
```

Try SSH if you can't capture the full strings over a serial connection. ...

16 May, 2012 · Logan Han

ZeroMQ - PHP client & Python server with multi part message example

Using pyobj or PHP data serialisation is not an option in this case. ...
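Since serialisation is off the table, each field travels as its own raw bytes frame of a ZeroMQ multipart message. A minimal sketch of that idea in plain Python (`encode_frames`/`decode_frames` are hypothetical helpers, not part of pyzmq; with pyzmq the frame list would go straight into `socket.send_multipart()`):

```python
# Illustration only: a ZeroMQ multipart message carries each field as its own
# raw bytes frame, which is why no pickle/PHP-serialize step is needed.

def encode_frames(fields):
    """Turn native values into the bytes frames of a multipart message."""
    return [str(f).encode("utf-8") for f in fields]

def decode_frames(frames):
    """What the Python server does after recv_multipart(): decode each frame."""
    return [f.decode("utf-8") for f in frames]

frames = encode_frames(["user42", "login", "2012-05-02"])
print(decode_frames(frames))  # ['user42', 'login', '2012-05-02']
```

ZeroMQ itself handles the wire framing between the PHP client and the Python server; this only shows that the parts are plain byte strings on both sides.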

2 May, 2012 · Logan Han

hubot pandora bot adapter

```coffeescript
# Chat with pandora bot
#
# ai <anything> - PANDORA AI
#
QS = require "querystring"
xml2js = require 'xml2js'

module.exports = (robot) ->
  robot.respond /(ai|AI)( me)? (.*)/i, (msg) ->
    user = msg.message.user.name
    query = msg.match[3]
    botid = "meh"
    parser = new xml2js.Parser({explicitArray: true})
    msg.http("http://www.pandorabots.com/pandora/talk-xml")
      .query({
        botid: botid
        custid: user
        input: query
      })
      .post() (err, resp, body) ->
        parser.parseString body, (err, result) ->
          #console.log(result.that)
          msg.send result.that
```
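The reply-extraction step the adapter does with xml2js can be sketched in Python for comparison; note the sample document below is a hypothetical talk-xml response body, not captured from the real service:

```python
# Sketch: pull the bot's reply out of a talk-xml response, i.e. the <that>
# element the adapter above reads as result.that. The sample XML is made up.
import xml.etree.ElementTree as ET

def extract_reply(xml_body):
    """Return the text of the <that> element, or None if it is absent."""
    return ET.fromstring(xml_body).findtext("that")

sample = "<result><input>hi</input><that>Hello there!</that></result>"
print(extract_reply(sample))  # Hello there!
```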

30 April, 2012 · Logan Han

cloudmailin hipchat hubot script

It took a really long time to deduce a workaround... :S Set CloudMailin's target address to http://blah.herokuapp.com/hubot/cloudmailin/room_id ...
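The receiving end of that URL can be sketched as follows: split the room id out of the `/hubot/cloudmailin/room_id` path and build a chat line from the POST form. The field names `from` and `plain` are assumptions about CloudMailin's payload, not taken from its documentation:

```python
# Hypothetical handler logic for a POST to /hubot/cloudmailin/<room_id>.
# The "from"/"plain" form field names are assumed, not verified.

def handle_cloudmailin(path, form):
    """Return (room_id, message) for an inbound CloudMailin POST."""
    room_id = path.rstrip("/").split("/")[-1]  # last path segment is the room
    sender = form.get("from", "unknown")
    body = form.get("plain", "").strip()
    return room_id, "Mail from %s: %s" % (sender, body)

room, msg = handle_cloudmailin(
    "/hubot/cloudmailin/12345",
    {"from": "a@example.com", "plain": "hello\n"})
print(room, msg)  # 12345 Mail from a@example.com: hello
```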

20 April, 2012 · Logan Han

Python - write compressed log file into HDFS for hadoop hive mapreduce

```python
import pyhdfs
import gzip
from cStringIO import StringIO
import binascii

-snip-

#Set hdfs connection info
hdfsaddress = "namenode"
hdfsport = 12345
hdfsfn = "filename"

#gzip compression level
clevel = 1

-snip-

logger.info("Writing compressed data into " + hdfsfn + ".gz")
#open hdfs file
fout = pyhdfs.open(hdfs, hdfsfn + ".gz", "w")

#compress the data and store it in compressed_data
buf = StringIO()
f = gzip.GzipFile(mode='wb', compresslevel=clevel, fileobj=buf)
try:
    f.write(concatlog)
finally:
    f.close()
compressed_data = buf.getvalue()

#write compressed data into hdfs
pyhdfs.write(hdfs, fout, compressed_data)

#close hdfs file
logger.info("Writing task finished")
pyhdfs.close(hdfs, fout)

-snip-
```
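The in-memory compression step above can be exercised on its own; a minimal Python 3 sketch (`io.BytesIO` replacing `cStringIO.StringIO`, with `gzip.decompress` verifying the round trip; the pyhdfs calls are left out):

```python
# Sketch of just the compress-to-buffer step from the script above, in
# Python 3 form. This only shows what ends up in compressed_data.
import gzip
import io

def gzip_bytes(data, level=1):
    """gzip `data` in memory at the given compression level."""
    buf = io.BytesIO()
    with gzip.GzipFile(mode="wb", compresslevel=level, fileobj=buf) as f:
        f.write(data)
    return buf.getvalue()

payload = b"log line\n" * 1000
compressed = gzip_bytes(payload)
assert gzip.decompress(compressed) == payload
print(len(payload), "->", len(compressed))
```

Level 1 trades compression ratio for speed, which suits a log-shipping path where the write into HDFS is the bottleneck.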

1 March, 2012 · Logan Han