Android Studio keeps deleting my project files (.iml) after a Gradle sync

I had this annoying issue where Android Studio kept deleting my top-level project file (.iml) whenever I did a Gradle sync. For months I had no idea why, and Google searches didn’t turn it up as a common issue. Finally, today I got to the bottom of it.

The issue was the casing of the file names. My project file name had mixed capital and lowercase letters, whereas the project folder name was all lowercase. The inconsistency came from moving the project from a Windows environment to a Mac, where I had used a different folder name.

The fix for this issue was simple: close the project in Android Studio and delete all the .iml files. Then use File > Import Project and select the project folder; this regenerates all the .iml files, and the new ones should match the case of the folder name.

PouchDB for Firefox Addon SDK

I started investigating a database for my Chrome and Firefox addon yesterday and came across PouchDB. PouchDB is basically a NoSQL database for the browser. It is inspired by CouchDB and can even sync to it. PouchDB has excellent cross-browser compatibility and uses different backends (WebSQL, IndexedDB, LevelDB for Node.js) depending on the capabilities of the browser or JS runtime. This makes it an excellent option for cross-platform web applications.
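For a flavour of the API, here is a minimal sketch that opens a local database, writes a document and replicates it to a CouchDB server (the database name, document and URL are just illustrative):

var db = new PouchDB('mydb');
// store a document, then push the database to CouchDB
db.put({ _id: 'doc1', note: 'hello' }, function (err, result) {
  if (err) { return console.error(err); }
  db.replicate.to('http://localhost:5984/mydb');
});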

PouchDB worked out of the box for Chrome extensions; however, it did not play well with Firefox extensions, especially ones built with the Firefox Addon SDK (Jetpack). PouchDB supports the CommonJS spec for loading modules, since it runs on Node.js, so it shouldn’t be hard to fix this, right? The main reason it would not work is that vital components like indexedDB, which are available on normal web pages, aren’t available in the Addon SDK’s window-less runtime context. If you use PouchDB within a normally loaded web page in the extension (i.e. resource://), it works fine, but you lose access to it when that page is closed. You want to be able to access PouchDB in your main.js.

So the trick is to bring back all the missing components that PouchDB needs to run. Luckily, there weren’t that many, so the fix was quite simple. At the time of writing, 3.3.0 was the latest version, and the following patch works fine with it. No guarantees for future versions:

Open up pouchdb-3.3.0.min.js and add the following lines right at the top before the comments:

if (typeof require !== 'undefined') {
  // Running under the Addon SDK: pull in the globals PouchDB expects
  // from the SDK's own modules.
  indexedDB = require('sdk/indexed-db').indexedDB;
  IDBKeyRange = require('sdk/indexed-db').IDBKeyRange;
  setTimeout = require('sdk/timers').setTimeout;
  clearTimeout = require('sdk/timers').clearTimeout;
  // A minimal window stand-in with just the pieces PouchDB uses.
  window = { btoa: require('sdk/base64').encode,
             atob: require('sdk/base64').decode,
             escape: require('sdk/querystring').escape };
  XMLHttpRequest = require("sdk/net/xhr").XMLHttpRequest;
}

That’s it, and you should be able to use require in your main.js like so (assuming you put pouchdb-3.3.0.min.js in the same lib directory):

var PouchDB = require('./pouchdb-3.3.0.min.js');
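From there the database can be used as usual inside main.js; a quick hypothetical sanity check:

var db = new PouchDB('addon-db');
db.put({ _id: 'first', created: Date.now() }, function (err, result) {
  console.log(err ? 'put failed: ' + err : 'stored rev ' + result.rev);
});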

One last piece of advice: web pages in your extension that run under resource:// will not be able to access data from PouchDB/IndexedDB created in the main.js window-less context. Each context has its own storage area, as if they were different websites. So pick one context to own PouchDB and use messages to pass data across contexts.
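To illustrate that message passing, here is a minimal sketch using an SDK panel, with main.js owning the database; the file names, message names and document id are all hypothetical:

// main.js answers document queries coming from the page
var self = require('sdk/self');
var db = new PouchDB('addon-db');

var panel = require('sdk/panel').Panel({
  contentURL: self.data.url('page.html'),
  contentScriptFile: self.data.url('bridge.js')
});

// bridge.js would send: self.port.emit('get-doc', 'some-id')
panel.port.on('get-doc', function (id) {
  db.get(id, function (err, doc) {
    panel.port.emit('doc', err ? null : doc);
  });
});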

Docker in an LXC container on Gentoo

Docker is the newest craze in the devops world. It’s a tool that assists with application containerization using Linux container technology. I decided to give it a try, but with a twist: I want to run Docker inside an LXC container, essentially running Docker containers inside LXC containers. This inception-style setup has a few benefits. It allows Docker and its dependencies to be contained, isolated from the host machine, and it allows testing different Docker versions in different containers. In my case, I want to run Docker under Ubuntu 14.04 without reformatting my entire Gentoo host.

Continue reading

DHT22 Temperature Sensor with Bleduino


About a year ago I pledged $39 for the Bleduino Kickstarter, and despite it arriving in my mailbox a few months ago, only today did I have time to start playing around with it. I also got a DHT22 temperature sensor, which I will use with the device. A quick disclaimer: I’m a beginner in Arduino and electronics and would describe my soldering skills as non-existent. Hence you will find no soldering in this post.

Continue reading

Node.js fun with the same TypeScript module spread over several .js files

I’m building a web application with Node.js and TypeScript for the first time and have discovered some “fun” quirks in how Node.js modules and TypeScript modules interact with each other. I use the term interact loosely, as they don’t actually talk to each other; in fact, each does its own separate thing. In this post, I’ll offer some tips on how I untangled this.

Node.js modules are based on the CommonJS module architecture, using require and exports. TypeScript has language-level support for modules, which are basically glorified namespaces. TypeScript also supports AMD, which is yet another module loading specification. Together, these patterns create a very confusing JavaScript ecosystem.
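To see why TypeScript modules are “glorified namespaces”, it helps to look at what the compiler emits. A module MyWebSite containing an exported HomeController class compiles to roughly this JavaScript:

var MyWebSite;
(function (MyWebSite) {
    var HomeController = (function () {
        function HomeController() { }
        return HomeController;
    })();
    // exported members are just properties on the namespace object
    MyWebSite.HomeController = HomeController;
})(MyWebSite || (MyWebSite = {}));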

Tying together TypeScript’s module system and Node.js requires a bit of glue code. First, you need to tell TypeScript about the exports and require keywords. This is easily done by going to this helpful library of TypeScript module definitions and downloading node.d.ts. Next, import the definition into your .ts file by putting this line at the top (here node.d.ts lives in a subfolder called typescript-node-definitions):

///<reference path='typescript-node-definitions/node.d.ts'/>

After that, exports and require should be recognised by the TypeScript compiler, and you can use Node.js modules by calling require inside your .ts files. You can also export with ease by putting this at the end of your .ts file:

exports.HomeController = MyWebSite.HomeController;

You can then reference the compiled js file normally in your main file:

var MyWebSite = require('./HomeController.js');

Most likely, you’ll have several files/classes that need to be under the same namespace, so a helper method can be used to corral them into a single namespace object:

// Merge the exports of several modules into one namespace-like object.
function requireall() {
	var cns = { };
	for (var i = 0; i < arguments.length; i++) {
		var ns = require(arguments[i]);
		for (var o in ns) {
			cns[o] = ns[o];
		}
	}
	return cns;
}

And replace the require call with:

var MyWebSite = requireall('./HomeController.js', './AboutController.js');

Trouble occurs when you try to keep one class per file. Your module is now distributed over multiple .ts files, and one of them contains a subclass that references a base class in another file. You can add a reference to the base class:

///<reference path="BaseController.ts"/>

This satisfies tsc, but when you run it under Node.js you’ll get an error:

    __.prototype = b.prototype;
                    ^
TypeError: Cannot read property 'prototype' of undefined

That’s because Node.js has no concept of TypeScript references; we need to use require to import the other file. Unfortunately, the module namespace we need to import into is recognised by TypeScript as a module, and any attempt to assign to it results in a compiler error. This is where things need to get a bit hacky: we have to inject the base class into the namespace without triggering the compiler alarm. Underhandedly, we can use eval to achieve this. Put this under the module blabla { line and modify the __importClassName and __importModuleName variables:

// inside the subclass file: pull the base class into the namespace
var __importClassName = "BaseController";
var __importModuleName = "MyWebSite";
eval(__importModuleName + "." + __importClassName + " = require(\"./" + __importClassName + ".js\")." + __importClassName + ";");
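With those values, the eval is equivalent to:

MyWebSite.BaseController = require("./BaseController.js").BaseController;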

This allows Node.js to resolve the base class, and the whole thing runs!

501 5.5.4 Invalid Address when including Sender Name on Windows Server + PHP mail + IIS SMTP or MS Exchange

An interesting issue I came across the other day: PHP was giving a 501 5.5.4 Invalid Address error when trying to send email. The server uses IIS SMTP, and the sender was specified with a display name in the From: header, i.e.

From: Me <me@example.com>

Just specifying the email address by itself works fine:

From: me@example.com

It turns out that this is a conflict between IIS SMTP and PHP. You need to specify the from email address separately by setting the ini configuration sendmail_from (used for the SMTP MAIL FROM: command, I presume), i.e.:

function mail2($from_address, $from_name, $to_address, $subject, $message, $headers) {
    // sendmail_from feeds the SMTP envelope sender (MAIL FROM:) on Windows
    $old_sender = ini_get('sendmail_from');
    ini_set('sendmail_from', $from_address);
    // the From: header carries the display name
    $headers = "From: " . $from_name . " <" . $from_address . ">\r\n" . trim($headers);
    $result = mail($to_address, $subject, $message, $headers);
    ini_set('sendmail_from', $old_sender);
    return $result;
}
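Calling it is then straightforward (the addresses here are placeholders):

mail2('me@example.com', 'Me', 'you@example.com', 'Hello', 'Message body', '');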

Presumably this is not an issue on UNIX as the external sendmail program handles delivery.

JavaScript ArrayBuffer – Binary handling in JavaScript

As JavaScript and HTML5 venture into applications never contemplated for them, one basic feature has been sorely missing from the API line-up: native support for binary. Currently, if you want to manipulate binary in JavaScript, you have to make do with an array of numbers, one per byte of the “byte array”. This is horribly inefficient given that each number probably occupies 4 or 8 bytes, so you use 4 or 8 times the space. The other alternative is base64, but that makes your binary blob 33% larger and is only good for serialisation and storage.
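Concretely, the number-array workaround looks like this sketch:

// pre-ArrayBuffer: a "byte array" of ordinary JS numbers
var bytes = [];
for (var i = 0; i < 1024; i++) {
  bytes.push(0xFF); // each logical byte stored as a full JS number
}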

This changed, however, with the advent of WebGL, the straw that broke the camel’s back. WebGL requires efficient processing of byte arrays, which heralded the creation of the ArrayBuffer. Although designed for WebGL, the ArrayBuffer can be used anywhere in your JavaScript. WebGL is currently available in Firefox 4 and Chrome. The ArrayBuffer allows you to allocate an opaque chunk of memory. To manipulate the buffer, we have to “cast” or map the buffer to a typed array. There are various typed arrays, such as Int16Array and Float32Array, but the one that interests us is Uint8Array, which allows us to view our ArrayBuffer as a byte array.

// allocate 1 KiB and fill every byte through a Uint8Array view
var buf = new ArrayBuffer(1024);
var bytes = new Uint8Array(buf);
for (var i = 0; i < bytes.length; i++) {
  bytes[i] = 0xFF;
}

The typed array may seem like a foreign concept in a dynamically-typed language such as JavaScript, but it’s necessary to provide good performance for binary handling. If you try to assign a Uint8Array element a number greater than 255, it is truncated modulo 256, and if you assign it a non-numeric string, it becomes 0, proving that this is more than just your average JS array.
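A quick demonstration of that behaviour:

var a = new Uint8Array(2);
a[0] = 256;    // wraps modulo 2^8, so 0 is stored
a[1] = 'abc';  // non-numeric string coerces to NaN, stored as 0
console.log(a[0], a[1]); // 0 0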

Already there is talk of using these in the HTML File API and the WebSockets API. WebSockets currently only supports plain-text UTF-8 frames, which means you can’t talk binary over the wire. Given that a lot of existing protocols are binary, this is a severe limitation. websockify, a web-based proxy that wraps native protocols into WebSocket frames, needs to base64-encode each frame; ArrayBuffer support would allow us to do away with that overhead.

MythWeb and Flash streaming

For a while I’ve heard of this mythical Flash streaming that is now supposedly built into MythWeb. However, I had yet to see it anywhere in the web interface. What gives? So I decided to get to the bottom of it. There’s a wiki article on MythTV web which describes how it’s done, but it’s said to be outdated and points to MythWeb’s wiki page, which only mentions that MythWeb has been rewritten to enable Flash streaming. So how do I enable it?

After digging in the source and finding various shenanigans with the WebFLV_on variable, the answer revealed itself in MythWeb’s preference pages (Settings > MythWeb > Video Playback): there is a tick box to “Enable Video playback”. However, it says it requires ffmpeg with mp3 support.

I’m using Gentoo, so installing ffmpeg was just a matter of emerge -av ffmpeg. After installing, I could finally tick “Enable Video Playback”. But when I tried playing a recording in the Flash player, a new stumbling block appeared: it said pl/stream/bla/bla.flv was not found, and navigating to it manually revealed a 500 Internal Server Error.

Since I’m using Lighttpd, I discovered that it does a poor job of logging CGI errors. The error.log was useless, and I ended up running Lighttpd in non-daemon mode (/usr/sbin/lighttpd -D -f /etc/lighttpd/lighttpd.conf) and reading the errors spat out onto the console. It turns out the stream script requires the Perl module Math::Round, which I hadn’t installed.

The story was actually a bit more cumbersome: before everything else, I needed to enable CGI on Lighttpd for Perl to work, and to get around a streaming path issue I modified $stream_url in /includes/defines.php so it wouldn’t double the slash on my root. But I know everyone just wants to see what I wanted to see when I embarked on this journey: a screenshot of it in action:

It’s by no means perfect: the video quality is low, there’s no seeking, and CPU usage is somewhat high. But it works!