Android Chat Head Library

A chat head is a UI element that floats on top of the screen instead of residing inside a conventional application. It is very convenient for multitasking, because the user can work and chat at the same time.
For example, if you are using the calculator and someone messages you on Facebook, the chat is shown over the calculator as a floating bubble. You can reply just by tapping the chat head and then resume your work afterward.

So no app switching!
In this post I introduce a chat head library for Android called Bubbles for Android.

How to use

Configuring your project dependencies

Add the library dependency in your build.gradle file:

dependencies {
    compile 'com.txusballesteros:bubbles:1.2.1'
}


Adding your first Bubble

Compose your Bubble layout, for example in an XML layout file. Remember that the root view of your Bubble layout has to be a BubbleLayout.
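As a sketch, a minimal bubble layout might look like this (the drawable name is just a placeholder):

```xml
<com.txusballesteros.bubbles.BubbleLayout
    xmlns:android=""
    android:layout_width="wrap_content"
    android:layout_height="wrap_content">

    <ImageView
        android:layout_width="70dp"
        android:layout_height="70dp"
        android:src="@drawable/bubble_avatar" />

</com.txusballesteros.bubbles.BubbleLayout>
```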



Create your BubblesManager instance.

private BubblesManager bubblesManager;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    bubblesManager = new BubblesManager.Builder(this).build();
    bubblesManager.initialize();
}

@Override
protected void onDestroy() {
    bubblesManager.recycle();
    super.onDestroy();
}

Attach your Bubble to the window.

BubbleLayout bubbleView = (BubbleLayout)LayoutInflater
                                    .from(MainActivity.this).inflate(R.layout.bubble_layout, null);
bubblesManager.addBubble(bubbleView, 60, 20);

Configuring your Bubbles Trash

If you want a trash area for removing on-screen bubbles, you can configure its layout.

Define your trash layout in XML (the drawable name here is just a placeholder):

<ImageView
    xmlns:android=""
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/bubble_trash"
    android:layout_gravity="bottom|center_horizontal" />

Configure the trash layout with your BubblesManager builder.

private BubblesManager bubblesManager;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    bubblesManager = new BubblesManager.Builder(this)
            .setTrashLayout(R.layout.bubble_trash_layout)
            .build();
    bubblesManager.initialize();
}

That's it! I hope you found it helpful. Please leave a comment below.

Google Now Tips & Tricks You Should Know

Apple has Siri, Microsoft has Cortana and Google has Google Now. Which one is better is a discussion that never ends. After using all of them, I personally think that Google Now is slightly better than the competition. And just to be clear, I'm not saying this because I'm an Android fanboy; my opinion is completely impartial.

Google Now, which started as a way to simply search Google, has evolved a lot over the years and has become much more than just a way to look stuff up. It is a full-fledged personal voice assistant now. Being accessible from any screen adds another level to its functionality.

Setting Up Google Now

Before diving into the voice commands, it is important that Google Now is set up according to your tastes and preferences. If you've already set it up, you deserve a high five. If you haven't, here's how to do it:
  1. Open the Google app and sign in with your Google account.
  2. Done signing in? Great. Now tap the hamburger menu in the top left corner of the screen.
  3. Tap "Settings" to manage things like accounts & privacy, search language, etc. Make sure to enable voice detection from any screen by going to Voice > "OK Google" detection > From any screen > turn on the toggle.
  4. Go to "Customize" to set your preferences.
  5. Once everything is done, you'll start to see relevant information in the form of Now Cards.
  6. Saying "Okay Google" from any screen will activate Google Now.
Once you get used to Google Now, it'll become difficult not to use it every day. Here are 10 Google Now voice commands to get the most out of this amazing service.

1. Do Calculations & Conversions

Not everyone is a mathematics wizard, and getting calculations done without thinking too hard is quite an amazing feeling.
Google Now can do your calculations for you. First of all, say "Okay Google" to fire up Google Now. Once it is listening, say "add 678548 and 224689". You'll get the answer in a second or two. You can use Google Now for other calculations like subtraction, division, or multiplication. Say "17% of 4578"; it can do that too.
Google Now is also useful when it comes to conversions. Say "Convert 46.5 kilometres to metres" and you'll get an accurate answer.
You can also use it for calculating a tip or finding a square root.

2. Check The Weather

If you want to know what the weather is going to be like at your place or anywhere else, Google Now can do that too. Just ask "What's the weather like?" to get the weather conditions at your current location. "Will it rain today?" or "Will I need an umbrella tomorrow?" would also do the trick.
If you want to know what the weather is like somewhere else, simply ask "What's the weather like in New York?". Replace New York with the place of your liking.

3. Set Alarms & Timers

Google Now can set alarms and timers too. Just say "OK Google" to launch Google Now, then say "Set a timer for 20 minutes" and it will notify you after 20 minutes.
To set an alarm, say "Wake me up at 5:45 AM". It will set an alarm for the mentioned time.

4. Define Words

I use this one quite often. If you're not sure what a word means, just ask Google Now. Simply say "What is the meaning of uncanny?" or "Define uncanny". Google Now will tell you what the word means.

5. Create Events & Reminders

Creating events and reminders with Google Now is as simple as playing Angry Birds.
Just say "Set a meeting with John Doe at 3:45 PM" or "Remind me to buy coffee when I'm near City Centre".

6. Do Translations

Almost everything can be done using Google's amazing services. You can even translate a sentence into another language. A trick like this proves useful when you are in a foreign place and don't understand the language.
Just say "How do you say excuse me in French?".

7. Call, Text, And Send Email

Google Now can do this too. There is no need to touch your phone now.
Just say "Call John Doe" to call John Doe. If there are multiple numbers for a person, it'll ask you which number to call.
Say "Text John Doe" to text John Doe. Next, dictate the content of the text, and then say "Send" to send it.
Say "Send an email to John Doe" to send an email.

8. Check The Distance

Google Now can tell you the distance between two points, i.e. point A and point B. For example, say "What's the distance between New York and New Jersey?". Once you get the distance, say "Show me the route" or "Give me directions" to get directions.

9. Ask Questions

Google Now can give you an answer to almost any question. Ask "Who is the author of A Song of Ice and Fire?" or "How long is Inception?" or "What's the population of India?". It can also answer follow-up questions. For example, ask "Who played Ted on How I Met Your Mother?" and it'll say Josh Radnor. Now ask "How old is he?" and Google Now remembers that you are talking about Josh Radnor.

10. Have Fun

You can also use Google Now to have some fun. Here are some of the voice commands:
  • Make me a sandwich
  • Do a barrel roll
  • Tell me a joke
  • What does the fox say?
  • When am I?
  • Beam me up Scotty
  • Who are you?
  • Askew

Circular Fillable Loaders Android Library


This is an Android library that lets you create beautiful circular fillable loaders, to be used on a splash screen, for example.

To make a circular fillable loader, add CircularFillableLoaders to your XML layout and add the CircularFillableLoaders library to your project, or grab it via Gradle:

compile 'com.mikhaellopez:circularfillableloaders:1.0.0'


<com.mikhaellopez.circularfillableloaders.CircularFillableLoaders
    android:id="@+id/circularFillableLoaders"
    android:layout_width="250dp"
    android:layout_height="250dp"
    android:src="@drawable/image"
    app:border_width="4dp"
    app:wave_color="#3f51b5" />

You can use the following properties in your XML to customize your CircularFillableLoaders.


  • app:progress (integer) -> default 0
  • app:border (boolean) -> default true
  • app:border_width (dimension) -> default 4dp
  • app:wave_color (color) -> default BLACK
  • app:wave_amplitude (float) -> default 0.05f (between 0.00f and 0.10f)


CircularFillableLoaders circularFillableLoaders =
        (CircularFillableLoaders) findViewById(;
// Set Progress
circularFillableLoaders.setProgress(40);
// Set Wave and Border Color
circularFillableLoaders.setColor(Color.BLUE);
// Set Border Width
circularFillableLoaders.setBorderWidth(10 * getResources().getDisplayMetrics().density);
// Set Wave Amplitude (between 0.00f and 0.10f)
circularFillableLoaders.setAmplitudeRatio(0.06f);

FAB-Loading Bar in Android - A new way of Loading bar

A floating action button (FAB) is simply a circular button with a rounded shadow that floats above the UI and is used to display a promoted action, like adding a new item or composing mail.

In this tutorial you are going to see a new kind of loading bar: a loading animation based on the Floating Action Button. Let's dive into this cool stuff.

Include the LoadingView widget in your layout. The scale and mfl_* attributes are all optional (the fully qualified class name shown here follows the library's package and may differ between versions):

<io.saeid.fabloading.LoadingView
    android:id="@+id/loading_view"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:scaleX="1.5"
    android:scaleY="1.5"
    app:mfl_onclickLoading="true"
    app:mfl_duration="200"
    app:mfl_repeat="4" />

Add your loading items.

Note that there are four types of loading animation, LoadingView.FROM_LEFT, LoadingView.FROM_TOP, LoadingView.FROM_RIGHT, LoadingView.FROM_BOTTOM.

  mLoadingView = (LoadingView) findViewById(;

  // you can also add a listener to receive callbacks (optional)
  mLoadingView.addListener(new LoadingView.LoadingListener() {
         @Override public void onAnimationStart(int currentItemPosition) {
         }

         @Override public void onAnimationRepeat(int nextItemPosition) {
         }

         @Override public void onAnimationEnd(int nextItemPosition) {
         }
  });

Call mLoadingView.startAnimation(); whenever you want to start the animation.

XML Attributes

XML Attribute            Related Method               Description
app:mfl_onclickLoading                                Start animation by clicking the FAB. (default is false)
app:mfl_duration         setDuration(int duration)    Set duration for each loading item. (default is 500 millis)
app:mfl_repeat           setRepeat(int repeat)        For values greater than 1, it calls the next animations automatically 'repeat-1' times. (default is 1)

Download Source Code

Android-based Smart Glass Round-Up — What’s New at CES 2016

While smart glasses aren’t really the coolest kid on the block anymore, they’ve come a long way since their humble beginnings. Prices are dropping, the app ecosystem is evolving, and with more companies investing heavily into VR (Virtual Reality, think the Oculus Rift/Samsung Gear VR and HTC Vive), AR (Augmented Reality) seems poised to make a comeback by piggybacking off the renewed VR craze.

While we have yet to see commercial Google Glass or Microsoft HoloLens devices, many smaller companies are hoping to become the next Oculus Rift and capture the AR market by storm. At this year’s CES in Las Vegas, we took a look at some of the latest AR smart glass offerings that are based on some form of Android.


Certainly innovative in the AR space, the ORA-X by Optinvent combines a smart glass with headphones so you can watch videos, listen to podcasts or music, and do more on the go. Sure, you can do the same on Google Glass, but even if you buy the earbud accessory the sound quality just won't be the same. And thanks to the likes of Beats by Dr. Dre, the social stigma against wearing headphones in public has pretty much disappeared (audiophiles, please don't hurt me!).

ORA-X Prototype

Navigation Buttons on the ORA-X

ORA-X Glass Module

Unlike Google's Glass, which runs a modified version of Android, the ORA-X runs a full version of Android 4.4.2 KitKat (though the team states it may be updated once the product launches). You're essentially walking around with an Android tablet on your head, with full access to the entire gamut of compatible apps on the Google Play Store. The headset/glass hybrid even has the standard Android navigation buttons on the side, so you won't have much trouble browsing the user interface. If tapping the side of your headphones isn't your cup of tea, you'll also be able to use voice commands to perform actions on the device.

Specifications:

Hardware Overview

Audio Specs

Display Specs

Misc Specs
On the hardware side, the device comes packed with a wide range of features that will satisfy any casual user. It features a sizable 2,000 mAh battery, which the company promises will last between 6 and 8 hours of continuous use (though, much like the Google Glass, intensive tasks such as video recording will eat up your battery life). The device also packs 2GB of RAM, which seems low compared to the 4GB flagship phones launching this year, but should be fine given that multitasking isn't really something most people would do while wearing one of these. The ORA-X also features a trackpad and a 9-axis position sensor, enabling some light gaming on the go. Combine that with 8GB of storage and you'll find plenty of room for most basic apps and games (though you might have trouble running Grand Theft Auto III).

The device has recently been funded thanks to a successful crowd-funding campaign on Indiegogo and is expected to launch sometime during the summer of 2016. At retail, the device will set you back $600, which is less than half the price of the current Google Glass Explorer Edition (though to be fair, we have no idea what the final consumer version of Glass will cost, if and when it comes). Overall, when you compare the ORA-X to the second generation Google Glass, the hardware is definitely up to par, but the fact that the ORA-X runs a full version of Android gives you more freedom in the kinds of apps you want to run.

Telepathy Walker

Telepathy Japan, Inc. is a Tokyo-based start-up, and its Walker is an AR device primarily aimed at gaming. Following the launch of the company's first wearable product in Japan, the Telepathy Jumper, the Telepathy Walker is planned for international release in the summer of 2016 for about $700, following a crowd-funding campaign that starts sometime in February. Details on the device are sparse, but I've pieced together as much information as I could after speaking with the company's marketing manager at CES.

Telepathy Japan’s Walker, like the ORA-X, runs a full version of Android 4.4 KitKat. However, the device apparently has some issues displaying apps not designed for its landscape interface, so in order to improve the user experience they plan on creating an app store of their own to feature compatible apps. The company’s Jumper product also pushes for third-party developer support, so this is not unexpected for the product. Telepathy includes a few VR apps via its partners, and promises support for additional VR apps and games such as Ingress when it launches.

Telepathy Walker, Top View

Ryutaro, Telepathy's Marketing Manager, with the Walker

Telepathy Walker, Front View

As for the hardware, the Walker is probably the smallest smart glass product I've seen yet. The company's push to make it a device you'll actually walk around with in public seems to have paid off. The Walker is light and sleek, and connects magnetically to a light headband that rests on your ears to hold the device up. Technical specifications were not available at the time, but it's safe to assume the device features most of the standard sensors and functionality you would expect from an AR smart glass product. We know from the company's press release that the device has WiFi and Bluetooth functionality, a 960×540 resolution display, and an unspecified 5MP camera, so it doesn't seem to lack any crucial features.

However, the battery life only lasts 2 hours at this time, according to the marketing manager, which is a bit disappointing for a device you're supposed to walk around with all day. The company has said they will work on improving it, which they definitely have time to do before launch.


You can’t teach an old dog new tricks, but that doesn’t mean you can’t teach your dog to get better at doing old tricks, right? Okay, I will admit that was bad, but my point is that unlike the other players on the block, Vuzix is no stranger to the AR space. They were the pioneers in the industry, and have already launched several successful AR products primarily aimed at the enterprise market. Vuzix is beefing up their product line this year with a slew of powerful new products. In particular, the M300 Smart Glasses and the 3000 Series, launching this summer and fall respectively, will introduce new ways for enterprise users to enhance their workflow.

Wearing a smart glass so far hasn't really felt like wearing a pair of glasses. That's mainly because, while these devices do rest on your ears and can fit snugly over your glasses, the device itself doesn't really look like the lens of an actual pair of glasses. Vuzix is trying to change that with their new product, the M3000, featuring 1.4mm thick lenses. The company leverages its optical technology to create truly see-through smart glass lenses, allowing you to see augmented reality content without restricting your view of the real world.

M300 Smart Glasses

M3000 with “Waveguide Optics”

The company has provided a full list of specifications on their website, but some interesting things to note are that both the M300 and M3000 run a full version of Android 6.0 Marshmallow on an Intel Atom CPU with 2GB of RAM and 16GB of storage. Although the devices feature only a 100 mAh onboard battery, they mostly operate off a hot-swappable 5,000 mAh external battery pack. Given that these devices are aimed at enterprise use, it makes sense to focus on external battery power rather than stuffing a larger but bulkier internal battery inside the devices. Indeed, the company's goal of capturing the enterprise market is reflected in its curated app store (which features a mix of enterprise and consumer apps) as well as in the company's M100/M300 Migration Package, which allows owners of the M100 to upgrade to the M300 at a reduced price. On both a hardware and a software basis, the M300/M3000 already seems ahead of the pack. The M300 and M3000 are aimed at a summer 2016 commercial launch, but you can pre-order the M300 starting February 1st on the Vuzix website.

If you would rather wait for an actual pair of smart glasses, then you'll be happy to know that Vuzix's 3000 Series will be launching a few months after the M300/M3000, sometime during the fall of 2016. Fewer technical details are revealed on the product's technical page, but we know that the device is based on the same waveguide optics technology that allows the M3000 smart glass lenses to be see-through. Combined with the company's "Cobra" display engine, Vuzix claims the device is "almost indistinguishable from regular sports sunglasses." We know from the page that these glasses also run on Marshmallow, but we do not yet know exactly what processor, storage capacity, battery, and sensors the 3000 Series will pack. There are three models in this series, two of them aimed at watching videos and one aimed at AR. The VidWear B3000 models allow you to watch videos from an HDMI video input (seriously) or wirelessly via Bluetooth/WiFi, depending on which model you buy. As with the M300/M3000 smart glasses, the 3000 Series will operate primarily off an external, hot-swappable battery pack. We will be following news on this device as additional details are revealed by the company, but for now, the 3000 Series is a cool glimpse into the future intersection of smart glasses and real eyewear.

Top 10 Best Automated Testing Tools and Frameworks for Java

Most programmers spend a lot of time debugging Java code. For most of us it is not a complex task, but writing test cases manually is a time-consuming process.

Fortunately, there are many automated testing tools and frameworks that can significantly improve the workflow of Java development.

When you release an untested product into a production environment, the reviews you get might not be kind. The better idea is to include a testing phase for your software. Ideally, this test code is written before the application code, to ensure that the code we write works properly.

Manual testing can be replaced with automated testing to save time and avoid repetitive tasks. Automated testing is a technique where we use third-party software to execute tests, report outputs and compare final results. These automated tools allow us to spend more time creating the logic of the testing code.

1. Arquillian

Arquillian is an extensible testing platform for the JVM that lets you easily create automated integration, functional and acceptance tests. You can test at run-time, and even run your Arquillian tests right alongside unit tests in your IDE. Your software and tests can share the same programming model, regardless of technology stack.

You can just drop a breakpoint in the test or software code and debug the test. You name the container, Arquillian will manage it by bundling test cases, deploying archives, executing tests in the containers, capturing results and creating reports.

2. Parasoft Jtest

Parasoft Jtest is a static analysis tool and automated testing suite that can be used for unit-test case generation and execution, regression testing and run-time error detection. The software is used by many big companies, including Cisco Systems and Wipro Technologies.

The static analysis automatically checks the code against hundreds of customizable rules in order to eliminate entire classes of programming errors. Currently, it supports Eclipse IDE, Maven, Ant, CruiseControl and IBM Rational Application Developer.

3. The Grinder

The Grinder makes it easy to run a distributed test using several load injector machines. All test scripts are written using a dynamic scripting language. The default language is Jython (Java implementation of Python).

It can test anything that has a Java API, such as HTTP web servers, SOAP and REST web services, application servers and custom protocols. The mature HTTP support includes automatic management of client connections, cookies, SSL and connection throttling. Also, the graphical console allows multiple load injectors to be monitored and controlled.

4. Marathon

Marathon is a GUI testing tool for Java/Swing applications. It comes with a recorder, runner and editor. The test scripts can be in Ruby or Python code. The aim is to produce test scripts that are readable by everyone on the project.

Marathon has some great features that set it apart from other testing frameworks: an integrated debugger and script console, powerful object recognition, a batch runner for unattended execution of test scripts, extract-method refactoring, resilient test suites, an application launcher to test your source code, and more.

5. TestNG

TestNG is a Java testing framework inspired by NUnit and JUnit. Like other tools, it supports unit, functional, end-to-end and integration testing. You can use it in Eclipse, IDEA, Selenium, Maven and Ant.

Some functionalities make it more powerful and easier to use, such as running tests in arbitrarily big thread pools, checking that code is multithread safe, flexible test configuration, dependent methods for application server testing, support for data-driven testing and parameters, and more.

6. Jameleon

Jameleon is an extensible, data-driven automated testing framework that can be easily used by non-technical users. It separates an application into features and allows those features to be tied together independently, creating test cases. These cases are then used to automate testing and to generate manual test case documentation.

The tool is broken up into different layers that can be learned by people having different skills. And because it is based on Java and XML, there is no need to learn a proprietary technology.

7. Abbot

Abbot helps you test your Java UI. It includes Costello (a script editor) which allows you to easily launch, explore and control an application. You can use the framework with both scripts and compiled code.

Costello can record user actions and facilitate script construction and maintenance. It provides a hierarchy browser which displays all components in use by the program code, as well as meta data about any component selected in the hierarchy.

8. Cactus

Cactus is a unit testing framework for server-side Java code such as Servlets, EJBs, Filters. The aim is to lower the cost of writing tests for server-side code using JUnit framework.

Cactus implements an in-container strategy (tests are executed inside the container) and supports testing of the view layer through integration with HttpUnit. Moreover, it provides a good middle ground in terms of test granularity. The tool is no longer being updated.

9. JWalk

Yet another unit testing toolkit designed to support a testing mechanism called Lazy Systematic Unit Testing. The JWalk tools propose all the significant test cases systematically, predict outcomes for many more test cases and generate new tests after a class’s configuration has changed.

The software consists of many tools including JWalktester, JWalker test engine, CustomGenerator, JWalkEditor, JWalkUtility and JWalkMarker. You’ll need a valid JWalk license (an electronic certificate, free of charge) to run the software.

10. JUnit

JUnit is a unit testing framework for the Java programming language. It is one of the family of xUnit frameworks that originated with SUnit. It is linked as a JAR at compile time and can be used to write repeatable tests.

There is a set of rules that allows flexible addition or redefinition of the behavior of each test method in a test class. You can write tests for projects related to scientific experiments using randomly generated data. Moreover, you can group your test together for easier test filtering.

How Password storage works in Android M

While Android has received a number of security enhancements in the last few releases, the lockscreen (also known as the keyguard) and password storage have remained virtually unchanged since the 2.x days, save for the addition of multi-user support. Android M is finally changing this with official support for fingerprint authentication. While the code related to biometric support is currently unavailable, some of the new code responsible for password storage and user authentication is partially available in AOSP's master branch.

Examining the runtime behaviour and files used by the current Android M preview reveals that some password storage changes have already been deployed. This post will briefly review how password storage has been implemented in pre-M Android versions, and then introduce the changes brought about by Android M.

Keyguard unlock methods

Stock Android provides three keyguard unlock methods: pattern, PIN and password (Face Unlock has been rebranded to 'Trusted face' and moved to the proprietary Smart Lock extension, part of Google Play Services). The pattern unlock is the original Android unlock method, while PIN and password (which are essentially equivalent under the hood) were added in version 2.2. The following sections will discuss how credentials are registered, stored and verified for the pattern and PIN/password unlock methods.

Pattern unlock

Android's pattern unlock is entered by joining at least four points on a 3×3 matrix (some custom ROMs allow a bigger matrix). Each point can be used only once (crossed points are disregarded) and the maximum number of points is nine. The pattern is internally converted to a byte sequence, with each point represented by its index, where 0 is top left and 8 is bottom right. Thus the pattern is similar to a PIN with a minimum of four and a maximum of nine digits which uses only nine distinct digits (0 to 8). However, because points cannot be repeated, the number of variations in an unlock pattern is considerably lower than that of a nine-digit PIN. As pattern unlock is the original, and initially sole, unlock method supported by Android, a fair amount of research has been done about its (in)security. It has been shown that patterns can be guessed quite reliably using the so-called smudge attack, and that the total number of possible combinations is less than 400 thousand, with only 1624 combinations for 4-dot (the default) patterns.
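Those figures are easy to verify by brute-force enumeration. The sketch below (plain Python, no Android dependency) implements the rule that a segment may not jump over an unvisited point and counts the valid patterns:

```python
# Enumerate all valid Android unlock patterns on the 3x3 grid.
# Rule: a segment may not pass over an unvisited point, e.g. 0 -> 2
# is only allowed once point 1 has already been used.

# Precompute which point (if any) lies exactly between a pair of points.
between = {}
for a in range(9):
    for b in range(9):
        (ra, ca), (rb, cb) = divmod(a, 3), divmod(b, 3)
        if a != b and (ra + rb) % 2 == 0 and (ca + cb) % 2 == 0:
            between[(a, b)] = ((ra + rb) // 2) * 3 + (ca + cb) // 2

def extend(last, used, counts):
    # each call represents one distinct path ending at 'last'
    if len(used) >= 4:
        counts[len(used)] = counts.get(len(used), 0) + 1
    for nxt in range(9):
        if nxt in used:
            continue
        mid = between.get((last, nxt))
        if mid is not None and mid not in used:
            continue  # segment would jump over an unused point
        extend(nxt, used | {nxt}, counts)

counts = {}
for start in range(9):
    extend(start, {start}, counts)

print(counts[4])             # 1624 four-point patterns
print(sum(counts.values()))  # 389112 patterns in total
```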
Android stores an unsalted SHA-1 hash of the unlock pattern in /data/system/gesture.key, or /data/system/users/<user ID>/gesture.key on multi-user devices. For the 'Z' pattern (points 0-1-2-4-6-7-8), it may look like this:
$ od -tx1 gesture.key
0000000 6a 06 2b 9b 34 52 e3 66 40 71 81 a1 bf 92 ea 73
0000020 e9 ed 4c 48
Because the hash is unsalted, it is easy to precompute the hashes of all possible combinations and recover the original pattern instantaneously. As the number of combinations is fairly small, no special indexing or file format optimizations are required for the hash table, and the grep and xxd commands are all you need to recover the pattern once you have the gesture.key file.
$ grep `xxd -p gesture.key` pattern_hashes.txt
00010204060708, 6a062b9b3452e366407181a1bf92ea73e9ed4c48
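You can confirm the mapping with a couple of lines of Python: hashing the raw point indices of the 'Z' pattern reproduces the stored digest.

```python
import hashlib

# The pattern is stored as raw point indices; gesture.key is simply
# their unsalted SHA-1 digest.
pattern = bytes([0, 1, 2, 4, 6, 7, 8])  # the 'Z' pattern
digest = hashlib.sha1(pattern).hexdigest()
print(digest)  # 6a062b9b3452e366407181a1bf92ea73e9ed4c48
```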

PIN/password unlock

The PIN/password unlock method also relies on a stored hash of the user's credential, however it also uses a 64-bit random, per-user salt. The salt is stored in the locksettings.db SQLite database, along with other settings related to the lockscreen. The password hash is kept in the /data/system/password.key file, which contains a concatenation of the password's SHA-1 and MD5 hash values. The file's contents may look like this:
$ cat password.key && echo
Note that the hashes are not nested, but their values are simply concatenated, so if you were to bruteforce the password, you only need to attack the weaker hash -- MD5. Another helpful fact is that in order to enable password auditing, Android stores details about the current PIN/password's format in the device_policies.xml file, which might look like this:
<policies setup-complete="true">
<active-password length="6" letters="0" lowercase="0" nonletter="6"
                 numeric="6" quality="196608" symbols="0" uppercase="0" />
</policies>
If you were able to obtain the password.key file, chances are that you would also have the device_policies.xml file.
This file gives you enough information to narrow down the search space considerably by specifying a mask or password rules. For example, we can easily recover the following 6-digit PIN using John the Ripper (JtR) in about a second by specifying the ?d?d?d?d?d?d mask and using the 'dynamic' MD5 hash format (hashcat has a dedicated Android PIN hash mode), as shown below. An 8-character (?l?l?l?l?l?l?l?l), lower-case-only password takes a couple of hours on the same hardware.
$ cat lockscreen.txt

$ ./john --mask=?d?d?d?d?d?d lockscreen.txt
Loaded 1 password hash (dynamic_1 [md5($p.$s) (joomla) 128/128 AVX 480x4x3])
Will run 8 OpenMP threads
Press 'q' or Ctrl-C to abort, almost any other key for status
456987           (user)
1g 0:00:00:00 DONE  6.250g/s 4953Kp/s 4953Kc/s 4953KC/s 234687..575297
Android's lockscreen password can be easily reset by simply deleting the gesture.key and password.key files, so you might be wondering what is the point in trying to bruteforce it. As discussed in previous posts, the lockscreen password is used to derive keys that protect the keystore (if not hardware-backed), VPN profile passwords, backups, as well as the disk encryption key, so it might be valuable if trying to extract data from any of these services. And of course, the chance that a particular user is using the same pattern, PIN or password on all of their devices is quite high.
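For reference, the salted SHA-1 plus MD5 scheme described above can be sketched in a few lines of Python. The salt-to-string conversion used here (unpadded lower-case hex, as Java's Long.toHexString() would produce it) is an assumption on my part, and the salt value below is arbitrary:

```python
import hashlib

def make_password_key(password, salt):
    # Assumption: the 64-bit salt is appended to the plaintext password
    # as unpadded lower-case hex, the way Java's Long.toHexString()
    # renders it.
    salted = (password + format(salt & 0xFFFFFFFFFFFFFFFF, 'x')).encode()
    sha1 = hashlib.sha1(salted).hexdigest().upper()
    md5 = hashlib.md5(salted).hexdigest().upper()
    # password.key is simply the two hex digests concatenated:
    # 40 characters of SHA-1 followed by 32 characters of MD5.
    return sha1 + md5

# arbitrary example values, not taken from a real device
key = make_password_key('456987', 0x2DD4F17DA42EA626)
print(len(key))  # 72
```

Because the two digests are independent, an attacker can ignore the SHA-1 half entirely and brute-force only the MD5 half, exactly as described above.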

Gatekeeper password storage

We briefly introduced Android M's gatekeeper daemon in the keystore redesign post in relation to per-key authorization tokens. It turns out the gatekeeper does much more than that and is also responsible for registering (called 'enrolling') and verifying user passwords. Enrolling turns a plaintext password into a so called 'password handle', which is an opaque, implementation-dependent byte string. The password handle can then be stored on disk and used to check whether a user-supplied password matches the currently registered handle. While the gatekeeper HAL does not specify the format of password handles, the default software implementation uses the following format:
typedef uint64_t secure_id_t;
typedef uint64_t salt_t;

static const uint8_t HANDLE_VERSION = 2;
struct __attribute__ ((__packed__)) password_handle_t {
    // fields included in signature
    uint8_t version;
    secure_id_t user_id;
    uint64_t flags;

    // fields not included in signature
    salt_t salt;
    uint8_t signature[32];

    bool hardware_backed;
};

Here secure_id_t is a randomly generated, 64-bit secure user ID, which is persisted in the /data/misc/gatekeeper directory in a file named after the user's Android user ID (*not* their Linux UID; 0 for the primary user). The signature format is left to the implementation, but AOSP's commit log reveals that it is most probably scrypt for the current default implementation. Other gatekeeper implementations might opt to use a hardware-protected symmetric or asymmetric key to produce a 'real' signature (or HMAC).

Neither the HAL nor the currently available AOSP source code specifies where password handles are to be stored, but looking through the /data/system directory reveals the following files, one of which happens to be the same size as the password_handle_t structure. This implies that it likely contains a serialized password_handle_t instance.
# ls -l /data/system/*key
-rw------- system   system         57 2015-06-24 10:24 gatekeeper.gesture.key
-rw------- system   system          0 2015-06-24 10:24 gatekeeper.password.key
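As a quick sanity check, the serialized fields of password_handle_t do add up to the 57 bytes of gatekeeper.gesture.key (the trailing hardware_backed flag apparently isn't part of the on-disk blob). A small sketch, assuming a packed little-endian layout:

```python
import struct

# version (1) + user_id (8) + flags (8) + salt (8) + signature (32) = 57 bytes,
# matching the size of gatekeeper.gesture.key in the ls output above.
HANDLE_FMT = '<B Q Q Q 32s'
print(struct.calcsize(HANDLE_FMT))  # prints 57
```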
That's quite a few assumptions though, so time to verify them by parsing the gatekeeper.gesture.key file and checking if the signature field matches the scrypt value of our lockscreen pattern (00010204060708 in binary representation). We can do so with the following Python code:
import binascii
import struct

import scrypt  # the PyPI 'scrypt' module

# scrypt parameters used by the default gatekeeper implementation
N = 16384
r = 8
p = 1

f = open('gatekeeper.gesture.key', 'rb')
blob = f.read()
f.close()

# version (1 byte) + user_id (8) + flags (8) = 17 bytes of signed metadata,
# followed by the 8-byte salt and the 32-byte signature
s = struct.Struct('<17s 8s 32s')
(meta, salt, signature) = s.unpack_from(blob)

password = binascii.unhexlify('00010204060708')
to_hash = meta + password

hash = scrypt.hash(to_hash, salt, N, r, p)

print 'Signature: %s' % signature.encode('hex')
print 'Hash:      %s' % hash[0:32].encode('hex')
print 'Equal:     %s' % (hash[0:32] == signature)

Signature: 3d1a20985dec4bd937e5040aadb465fc75542c71f617ad090ca1c0f96950a4b8
Hash:      3d1a20985dec4bd937e5040aadb465fc75542c71f617ad090ca1c0f96950a4b8
Equal:     True
The program output above leads us to believe that the 'signature' stored in the password handle file is indeed the scrypt value of the blob's version, the 64-bit secure user ID, and the blob's flags field, concatenated with the plaintext pattern value. The scrypt hash value is calculated using the stored 64-bit salt and the scrypt parameters N=16384, r=8, p=1. Password handles for PINs or passwords are calculated in the same way, using the PIN/password string value as input.

With this new hashing scheme, patterns and passwords are treated in the same way, and thus patterns are no longer easier to bruteforce. That said, with the help of the device_policies.xml file, which gives us the length of the pattern, and a pre-computed pattern table, one can drastically reduce the number of patterns to try, as most users are likely to use 4-6 step patterns (about 35,000 total combinations).
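The ~35,000 figure can be reproduced by enumerating patterns directly. The sketch below is my own illustration (not AOSP code) and applies the standard lockscreen rule that a line may only pass over an intermediate dot if that dot has already been visited:

```python
# 3x3 grid numbered 0..8:
#   0 1 2
#   3 4 5
#   6 7 8
# skip[(a, b)] is the dot lying between a and b, if any; the move a -> b is
# only legal when that intermediate dot has already been visited.
skip = {}
for a, b, mid in [(0, 2, 1), (3, 5, 4), (6, 8, 7),   # horizontal lines
                  (0, 6, 3), (1, 7, 4), (2, 8, 5),   # vertical lines
                  (0, 8, 4), (2, 6, 4)]:             # diagonals
    skip[(a, b)] = skip[(b, a)] = mid

def extend(path, visited, max_len, counts):
    # Count every valid pattern of length 4 or more, then keep extending.
    if len(path) >= 4:
        counts[len(path)] = counts.get(len(path), 0) + 1
    if len(path) == max_len:
        return
    for nxt in range(9):
        if nxt in visited:
            continue
        mid = skip.get((path[-1], nxt))
        if mid is not None and mid not in visited:
            continue  # would pass over an unvisited dot
        extend(path + [nxt], visited | {nxt}, max_len, counts)

counts = {}
for start in range(9):
    extend([start], {start}, 6, counts)

print(counts)                # {4: 1624, 5: 7152, 6: 26016}
print(sum(counts.values()))  # 34792
```

Counting 4-6 step patterns gives 1624 + 7152 + 26016 = 34792, in line with the "about 35,000" estimate above.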

Because Android M's password hashing scheme doesn't directly use the plaintext password when calculating the scrypt value, optimized password recovery tools such as hashcat or JtR cannot be used directly to evaluate bruteforce cost. It is however fairly easy to build our own tool in order to check how a simple PIN holds against a brute force attack, assuming both the device_policies.xml and gatekeeper.password.key files have been obtained. As can be seen below, a simple Python script that tries all PINs from 0000 to 9999 in order takes about 10 minutes, when run on the same hardware as our previous JtR example (a 6-digit PIN would take about 17 hours with the same program). Compare this to less than a second for bruteforcing a 6-digit PIN for Android 5.1 (and earlier), and it is pretty obvious that the new hashing scheme Android M introduces greatly improves password storage security, even for simple PINs. Of course, as we mentioned earlier, the gatekeeper daemon is part of Android's HAL, so vendors are free to employ even more (or less...) secure gatekeeper implementations.
$ time ./ gatekeeper.password.key 4
Trying 0000...
Trying 0001...
Trying 0002...

Trying 9997...
Trying 9998...
Trying 9999...
Found PIN: 9999

real    9m46.118s
user    9m6.804s 
sys 0m39.107s
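For illustration, here is a minimal, hypothetical sketch of such a bruteforcer in Python 3 using hashlib.scrypt. It fabricates a handle for a known PIN rather than reading gatekeeper.password.key, and uses reduced scrypt parameters so it runs quickly; a real handle uses N=16384, r=8, p=1 and the on-disk format parsed earlier:

```python
import hashlib
import os
import struct

# Demo scrypt parameters -- a real gatekeeper handle uses N=16384, r=8, p=1;
# these are reduced so the sketch runs in a fraction of the real cost.
N, R, P = 1024, 8, 1

def handle_signature(meta, pin, salt):
    # signature = scrypt(version || user_id || flags || password, salt)
    return hashlib.scrypt(meta + pin.encode(), salt=salt, n=N, r=R, p=P, dklen=32)

# Fabricate a handle for a 'forgotten' 4-digit PIN (all values hypothetical).
meta = struct.pack('<B Q Q', 2, 0x1234567890abcdef, 0)  # version, user_id, flags
salt = os.urandom(8)
target = handle_signature(meta, '0042', salt)

# Try all 4-digit PINs in order until the recomputed signature matches.
for i in range(10000):
    pin = '%04d' % i
    if handle_signature(meta, pin, salt) == target:
        print('Found PIN: %s' % pin)
        break
```

Because each candidate requires a full scrypt computation, the per-guess cost (not the tool) is what dominates the running time, which is exactly the point of the new scheme.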

Framework API

Android M is still in preview, so framework APIs are hardly stable, but we'll show the gatekeeper's AIDL interface for completeness. In the current preview release it is called IGateKeeperService and looks like this:
interface android.service.gatekeeper.IGateKeeperService {

    void clearSecureUserId(int uid);

    byte[] enroll(int uid, byte[] currentPasswordHandle, 
                  byte[] currentPassword, byte[] desiredPassword);

    long getSecureUserId(int uid);

    boolean verify(int uid, byte[] enrolledPasswordHandle, byte[] providedPassword);

    byte[] verifyChallenge(int uid, long challenge, 
                           byte[] enrolledPasswordHandle, byte[] providedPassword);
}
As you can see, the interface provides methods for generating/getting and clearing the secure user ID for a particular user, as well as the enroll(), verify() and verifyChallenge() methods whose parameters closely match the lower level HAL interface. To verify that there is a live service that implements this interface, we can try to call the getSecureUserId() method using the service command line utility like so:
$ service call android.service.gatekeeper.IGateKeeperService 4 i32 0
Result: Parcel(00000000 ee555c25 ea679e08  '....%\U...g.')
This returns a Binder Parcel with the primary user's (user ID 0) secure user ID, which matches the value stored in /data/misc/gatekeeper/0 shown below (stored in network byte order).
# od -tx1 /data/misc/gatekeeper/0
37777776644 25 5c 55 ee 08 9e 67 ea
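The correspondence between the two dumps is easy to check: Parcel hex dumps print 32-bit words, so unpacking the file's eight bytes as two little-endian uint32 values reproduces the words shown by the service call. A small sketch using the bytes from the od output above:

```python
import struct

# The eight bytes of /data/misc/gatekeeper/0, as shown by od above.
raw = bytes([0x25, 0x5c, 0x55, 0xee, 0x08, 0x9e, 0x67, 0xea])

# Parcel hex dumps print 32-bit words; unpacking the raw bytes as two
# little-endian uint32s reproduces the words printed by 'service call'.
lo, hi = struct.unpack('<II', raw)
print('%08x %08x' % (lo, hi))  # ee555c25 ea679e08
```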
The actual storage of password hashes (handles) is carried out by the LockSettingsService (interface ILockSettings), as in previous versions. The service has been extended to support the new gatekeeper password handle format, as well as to migrate legacy hashes to the new format. It is easy to verify this by calling the checkPassword(String password, int userId) method which returns true if the password matches:
# service call lock_settings 11 s16 1234 i32 0
Result: Parcel(00000000 00000000   '........')
# service call lock_settings 11 s16 9999 i32 0
Result: Parcel(00000000 00000001   '........')


Summary

Android M introduces a new system service -- gatekeeper, which is responsible for converting plaintext passwords to opaque binary blobs (called password handles) which can be safely stored on disk. The gatekeeper is part of Android's HAL, so it can be modified to take advantage of the device's native security features, such as secure storage or a TEE, without modifying the core platform. The default implementation shipped with the current Android M preview release uses scrypt to hash unlock patterns, PINs or passwords, and provides much better protection against bruteforcing than the previously used single-round MD5 and SHA-1 hashes.