Categories
Android Source Code

Hello Moto 360 from Delphi XE7

I really like my Moto 360 watch. It looks fabulous and does great as an extension of my Android phone, but of course the most important question is how to make an app for it. Turns out it just works with RAD Studio XE7, Delphi or C++. Thanks to the new FireUI Multi-Device Designer I can create a custom view in the IDE that makes designing the perfect user interface a breeze. Here are some of the details of what I discovered along the way, followed by a download of my sample and the custom view.

The bottom line is it just works, which really isn't a surprise. Most new Android devices use ARMv7 and the NEON instruction set. (NEON is kind of like the MMX standard on the Intel platform: at first not everyone used those instructions, but once they caught on, everyone did.) So it is no surprise that the Moto 360 does too. Unlike some other watches, the Moto 360 does not have a micro USB port, so you have to use ADB over Bluetooth. That requires a few extra steps to set up, and deployment over it is really slow. So slow that I canceled the first two deployments because I thought I had set something up wrong.

First of all, the Moto 360 display is not perfectly round: it has a flat area at the bottom, and if you look closely you can see the light sensor there. I'm not sure if that is the reason it isn't fully round, or if there was another design consideration. In any case, the screen resolution is 320 x 290 pixels at 213 pixels per inch (PPI), which means at design time you have a usable logical area of 240 x 218 pixels. That is the information we need to create a custom view. Just put the following code in a package.

  TDeviceinfo.AddDevice(TDeviceinfo.TDeviceClass.Tablet, ViewName,
    // The Moto 360 is 320x290 physical and 240x218 logical with 213 PPI
    TSize.Create(320, 290), TSize.Create(240, 218),
    TOSVersion.TPlatform.pfAndroid, 213,
    True); // Exclusive
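
For context, here is a minimal sketch of the whole unit that call can live in, with the registration done from the initialization section so it runs as soon as the package (or app using the unit) loads. The unit name is mine, and the uses clause is my best guess at where TDeviceinfo is declared (System.Devices in XE7), so adjust it if your compiler says otherwise.

unit Moto360View; // hypothetical unit name; add it to a package (and to the app)

interface

implementation

uses
  System.Types,    // TSize
  System.SysUtils, // TOSVersion
  System.Devices;  // assumed home of TDeviceinfo in XE7

const
  ViewName = 'Moto360'; // must match the <Name> element in MobileDevices.xml

initialization
  TDeviceinfo.AddDevice(TDeviceinfo.TDeviceClass.Tablet, ViewName,
    // The Moto 360 is 320x290 physical and 240x218 logical with 213 PPI
    TSize.Create(320, 290), TSize.Create(240, 218),
    TOSVersion.TPlatform.pfAndroid, 213,
    True); // Exclusive
end.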

The device class enumeration actually has a Watch value, but looking at the code that detects the class at runtime, it doesn't know how to detect a watch yet, so it defaults to Tablet. That makes sense when you consider that XE7 was released before the Moto 360; I imagine an update will address it.

To get the custom view to show up in the IDE, you need to update the XML file found at %AppData%\Embarcadero\BDS\15.0\MobileDevices.xml (%AppData% already points into the Roaming folder) to reference the new view. Inside the MobileDevices element, add the following:

  <MobileDevice>
    <Displayname>Moto360</Displayname>
    <Name>Moto360</Name>
    <DevicePlatform>3</DevicePlatform>
    <FormFactor>2</FormFactor>
    <Portrait Enabled="True" Width="240" Height="218" Top="102" Left="29" StatusbarHeight="0" StatusBarPos="0" Artwork="C:\Users\jim\Documents\Embarcadero\Studio\Projects\HelloMoto360\Moto360.png" />
    <UpsideDown Enabled="False" Width="240" Height="218" Top="0" Left="0" StatusbarHeight="0" StatusBarPos="0" Artwork="" />
    <LandscapeLeft Enabled="False" Width="240" Height="218" Top="0" Left="0" StatusbarHeight="0" StatusBarPos="0" Artwork="" />
    <LandscapeRight Enabled="False" Width="240" Height="218" Top="0" Left="0" StatusbarHeight="0" StatusBarPos="0" Artwork="" />
  </MobileDevice>

You'll need to update the Artwork path to point to the correct location of the PNG on your system, or you can just leave it blank. Here is what it all looks like when set up in the IDE.

Hello Moto 360 in the XE7 IDE

You’ll notice a red circle on the design surface. I added this to see where the corners are (since the display is round). At runtime you can just barely see the red if you hold the watch right. In production I’d hide this at runtime. I placed the TCircle at -1, -1 and set the size to 242 x 242. This way the circle follows the bezel and not the display area of the screen. I suppose if I bumped it out another pixel it would disappear completely at runtime.
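
If you would rather position the circle in code (and hide the guide for release builds), a minimal sketch along these lines should work; Circle1 is an assumed TCircle already dropped on the form.

uses
  FMX.Objects; // TCircle

procedure TForm1.FormCreate(Sender: TObject);
begin
  // Place the circle so it follows the 242x242 bezel rather than the
  // 240x218 display area; at -2, -2 and 244x244 it would disappear entirely.
  Circle1.SetBounds(-1, -1, 242, 242);
{$IFNDEF DEBUG}
  Circle1.Visible := False; // hide the design guide in release builds
{$ENDIF}
end;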

To get the Moto 360 to show up as a target device you first need to enable Bluetooth debugging.

  1. Hold the side button in until Settings appears
  2. Swipe down to About and tap it.
  3. Tap on Build number until it tells you that you are a developer.
  4. Swipe back to Settings and then tap on Developer options.
  5. Tap on ADB Debugging until it says Enabled.
  6. Tap on Debug over Bluetooth until it says Enabled.
  7. On your paired phone, go into the Android Wear settings (the gear icon in the upper right).
  8. Enable Debugging over Bluetooth.
    1. It should show
      • Host: disconnected
      • Target: connected
    2. Target is the watch, Host is your computer.

Then connect the phone that is paired with the Moto 360 to your computer via USB and run the following commands (assuming ADB is on your system path) to connect over Bluetooth. I made a batch file.

 @echo off
 REM optional cleaning up
 adb disconnect localhost:4444
 adb -d forward --remove-all
 REM this is the connection
 adb -d forward tcp:4444 localabstract:/adb-hub
 adb -d connect localhost:4444
 REM these lines are to see if it worked
 echo Here are the forwarded ports . . . .
 adb forward --list
 echo.
 echo Wait a second for it to connect . . . .
 pause
 adb devices

The ADB Devices list should show something like

List of devices attached 
123456abcd device 
localhost:4444 device

Now the Android Wear app on your phone should show

  • Host: connected
  • Target: connected

Be sure that your Moto 360 app uses the unit that defines the Moto 360 device (from your package) so the matching view can be selected at runtime. If you do all that, you'll see something similar to this running on the Moto 360:

Hello Moto 360 from Delphi XE7

My camera had a hard time focusing on it, but rest assured it looks fabulous! I tried C++ too and, no surprise, it works as well. There is more experimenting to do, but it is nice to know I have a tool that will take me everywhere I want to go.

If you don't have a Moto 360, you can set up an Android emulator (AVD) instead. I did that before mine showed up. You need to download the Android 4.4W (API 20) SDK Platform and the ARM system image.

Android Wear SDK Download

Then create an AVD with the new Emulator.

Android Wear AVD Settings

It actually gives you the rectangular screen inside a round bezel. Also, it is 320 x 320 (so completely round) at 240 PPI. This means the view I created (since it was exclusive) won't work on the emulator. You'll need to create a new custom view for the emulator, but I'll leave that up to you.
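
If you do want to try it, the registration call would look roughly like the following. The view name is hypothetical and the logical size is just my arithmetic (physical x 160 / PPI, so 320 x 160 / 240 is about 213), so verify it against what the designer reports.

  // Hypothetical view for the round 320x320, 240 PPI Android Wear emulator.
  // Logical size assumed to be physical * 160 / PPI = 320 * 160 / 240 = 213.
  TDeviceinfo.AddDevice(TDeviceinfo.TDeviceClass.Tablet, 'Moto360Emulator',
    TSize.Create(320, 320), TSize.Create(213, 213),
    TOSVersion.TPlatform.pfAndroid, 240,
    True); // Exclusive, like the watch view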

You can download all of my code for the custom view, the Bluetooth ADB batch file, and the sample apps from GitHub. (Update: added a view for the Galaxy Gear Live & LG-G.) BTW, XE7 adds local Git support, which is another great new feature. Download the trial and check it out.

Categories
Android devices iOS Mobile

The FireUI: Multi-Device Designer in RAD Studio XE7

Here is the video replay, slides and resources from my Developer Skill Sprint on the new Multi-Device Designer in RAD Studio XE7. This is one part of the new FireUI, the evolution of FireMonkey.

The Multi-Device Designer is a new feature in Appmethod, RAD Studio, Delphi and C++Builder XE7 that makes it easy to maximize the reuse of your visually designed forms across devices, while still giving you as much flexibility and customization as possible.

Design your UI once for Windows, OS X, iOS and Android, then customize it for different screen sizes: iPad, iPhone, Tablet, Google Glass, Surface Pro, etc.

You can view the slides on Google Docs.

Check out the Guided Tour on the Welcome Page and the following DocWiki pages:

Check out the other skill sprints too...

Categories
Android Mobile

What About Blackberry?

One of the most common questions we get when we talk about new features in Delphi, C++Builder and RAD Studio is “What about Blackberry?” which is almost as common as similar questions about Windows Phone or Linux. iOS and especially Android rule the smartphone OS market, but Blackberry still has a place on most charts (unlike Symbian and some others).

IDC: Smartphone OS Market Share 2013, 2012, and 2011 Chart

Well, now RAD Studio XE6, Delphi, C++Builder and Appmethod support 96.8% of the shipping smartphone platforms: thanks to the latest update to Blackberry 10 (10.2.1 or later), it now runs native Android APK apps without needing a port. I tested on a Z10 developer device, but it should work on the Q10, Q5, Z30, or others. To be clear, Blackberry still runs their own OS, but that OS is able to run native Android apps.

Our IDE doesn’t recognize the Blackberry device, again because it is not running Android. But once you build your APK you can transfer it to the Blackberry device using whatever method is most convenient for you. I used Dropbox. Once you have the APK on the Blackberry you simply need to install it.

I built a few samples, including one that takes a picture, and they all more or less worked as expected. When the ShareSheet came up, the usual suspects like Facebook and Twitter were not there, but I didn’t have those set up yet on my test device, so that is to be expected.

You can take things a step further and repackage and sign your app to distribute through the Blackberry store, but that isn’t necessary. You can deploy your APK directly to the Blackberry, or distribute it through the Amazon App Store. Crackberry has a guide on installing APKs too, with a little more detail.

The Blackberry Developer site has useful pages:

Categories
Android iOS Mobile REST Source Code webinar

Mobile Summer School 6: REST & BaaS

Here are the slides and downloads from my mobile summer school session on REST & BaaS. If you just want the slides, they are on SlideShare. I'll post the video and more details here later.

For more information on BaaS, check out Sarina’s excellent series of blog posts.

Categories
Android Mobile Source Code Tools webinar

Skill Sprint: Android Voice – Speech Recognition and TTS

Androids can talk and listen! For my Developer Skill Sprint I was originally scheduled to show how to do a Google Glass voice trigger. That is pretty cool because it lets you launch a Google Glass app with your voice, but I decided to expand on it to also show how the Google Glass app can be launched with the results of additional voice input, as well as how to take dictation and do text-to-speech everywhere else in Android.

I’ve still got a lot of work to do on the components, but they work as is for now. If you want to modify the component code then take a look at my Skill Sprint and blog post on the Android JNI Bridge.


Categories
Android Mobile MVP News Source Code Tools webinar

Android JNI Bridge and Custom Classes.dex

By creating a custom classes.dex you can get access to 3rd party Java JAR APIs from your application. For my Integrate More Android with a JNI Call to your Android App Developer Skill Sprint I created a demo app that demonstrates creating a custom classes.dex, which is a new feature in XE6 and Appmethod 1.14. [Download the demo] [Download the slides] The demo app uses the Base64Coder JAR file (included). To build the demo:

  1. Examine the createdex.bat file to make sure it refers to the correct location for your dx.bat utility and the fmx.jar & android-support-v4.jar files.
  2. Run the createdex.bat file to create the classes.dex file which includes the two jar files above, plus the base64coder.jar file.
  3. Double check that the Deployment Manager references the new classes.dex and not the old ones, and that the remote path is “classes\”
  4. Notice that the android.JNI.Base64Coder.pas file wraps and exposes the methods of the Base64Coder class (see the usage sketch after this list).
  5. Run the app on your Android device and verify that it works as expected.
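
As a usage sketch for step 4, calling the wrapped class looks something like this. The TJBase64Coder and encodeString names follow the usual JNI Bridge import convention and are my assumptions; check android.JNI.Base64Coder.pas in the demo for the actual declarations.

uses
  Androidapi.Helpers,       // StringToJString, JStringToString
  Androidapi.JNI.JavaTypes, // JString
  android.JNI.Base64Coder,  // the wrapper unit from the demo
  FMX.Dialogs;              // ShowMessage

procedure TForm1.Button1Click(Sender: TObject);
begin
  // Base64Coder.encodeString is a static Java method, so it is reached
  // through the class-level JavaClass reference of the imported type.
  ShowMessage(JStringToString(
    TJBase64Coder.JavaClass.encodeString(StringToJString('Hello JNI'))));
end;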

The Base64Coder.JAR is Android specific, so it will not work on iOS or Windows.

Some additional notes from the Developer Skill Sprint. Useful units for making JNI calls (see the example after this list):

  • Androidapi.Jni – Java Native Interface type definitions
  • Androidapi.JNIBridge – The JNI Bridge
  • Androidapi.JNI.JavaTypes – JString and other common types.
  • Androidapi.Helpers – JStringToString and other useful conversions.
  • FMX.Platform.Android – Useful platform methods like GetAndroidApp, MainActivity and ConvertPointToPixel
  • Others useful units: Androidapi.AppGlue, Androidapi.JNIMarshal, Androidapi.JNI.Embarcadero
  • For more see: C:\Program Files (x86)\Embarcadero\Studio\14.0\source\rtl\android (Object Pascal) and C:\Program Files (x86)\Embarcadero\Studio\14.0\include\android\rtl (C++)
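
To give a sense of how a few of those units fit together, here is a minimal example of my own (not from the skill sprint) that reads the device model through the JNI Bridge:

{$IFDEF ANDROID}
// In the implementation section of a unit built for the Android target.
uses
  Androidapi.Helpers, // JStringToString
  Androidapi.JNI.Os,  // TJBuild wraps android.os.Build
  FMX.Dialogs;        // ShowMessage

procedure ShowDeviceModel;
begin
  // android.os.Build.MODEL is a static Java field; it is read through the
  // class-level JavaClass reference on the imported TJBuild type.
  ShowMessage(JStringToString(TJBuild.JavaClass.MODEL));
end;
{$ENDIF}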

You will want to make use of Conditional Defines in Object Pascal and Predefined Macros in C++. In my blog post on Android Settings I showed how to make a JNI call with Object Pascal, but you can also look at the DeviceInfo Mobile Code Snippet in both C++ and Object Pascal. To create your own JNI Bridge wrappers, look at the RTL source in the rtl\android (Object Pascal) and include\android\rtl (C++) folders listed above. You can also consider the following 3rd party utilities:

If you just want to include standard Android APIs, check out the project on GitHub from FMXExpress (also an Embarcadero MVP) that includes all the Android APIs. Here is the video replay of my skill sprint:

Also, check out Brian Long's video on accessing the Android API with XE5.

Categories
Android Components gadgets iOS Source Code

Parrot AR.Drone 2.0 Delphi Component

I took the code I previously used to control the Parrot AR.Drone and turned it into a reusable component. I added some more functionality to it as well, although there is a lot more to cover. The component is available on GitHub.

It should work with Delphi, C++Builder, Appmethod and RAD Studio on iOS, Android, Windows and OS X. I’d love to hear how it works for you and what you use it for!

Categories
Android

Delphi XE6 is on Fire

Remember the demonstration I did showing how Delphi works great with the Ouya? Well, there is a new set-top box in town, the Amazon Fire TV. Just like the Ouya, it is an Android-powered set-top box. So, does it support everyone's favorite Android development tool? Turns out it works great with Delphi XE6. It uses a wireless ADB connection, so you just need to follow their simple steps to Connect ADB and it shows up in the IDE as the Android device AFTB (not sure what that stands for). After that you can develop and deploy to it just like any other Android device.

When you run your app from the IDE it will pop up and run on the Fire TV, but after you exit the app you won't see it anymore. It appears Fire TV only shows apps installed from its app store on the home screen. Not to worry, you can easily launch it from the Settings -> Applications screen: simply select your app and choose Launch application.

By default your only input device is the Amazon Fire TV remote.

firetvremote

Simply handle the OnKeyUp event on your form and you will receive events with the key codes vkLeft, vkRight, vkUp, vkDown, vkMenu, vkHardwareBack and vkMediaPlayPause, plus a key value of 0 for the Select, Fast Forward and Rewind buttons. The Home and Voice Search buttons are always handled by the OS. If you choose to handle vkHardwareBack, set the key value to 0 (or any other value) so the system will ignore it; otherwise your app will exit.
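
Here is a minimal sketch of such an OnKeyUp handler. Label1 is an assumed TLabel on the form, and the vk constants come from System.UITypes.

procedure TForm1.FormKeyUp(Sender: TObject; var Key: Word; var KeyChar: Char;
  Shift: TShiftState);
begin
  case Key of
    vkLeft:           Label1.Text := 'Left';
    vkRight:          Label1.Text := 'Right';
    vkUp:             Label1.Text := 'Up';
    vkDown:           Label1.Text := 'Down';
    vkMenu:           Label1.Text := 'Menu';
    vkMediaPlayPause: Label1.Text := 'Play/Pause';
    0:                Label1.Text := 'Select / FF / RW'; // all three share key value 0
    vkHardwareBack:
      begin
        Label1.Text := 'Back';
        Key := 0; // swallow the back key so the app does not exit
      end;
  end;
end;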

Fire TV also supports a gamepad, as well as various other Bluetooth input devices like a Bluetooth keyboard, which greatly expands the input options. You could also use App Tethering to tether a mobile or desktop app to your Fire TV app to provide input and display it on the big screen, like an image receiver that takes pictures from your mobile device and shows them to everyone in the room.

If you download the Fire TV SDK it exposes Game Controller and Notification classes. If there is interest I can post some Object Pascal wrappers for those later on.


Categories
Android gadgets Mobile

BlueTooth Remote Control Car

How to use Bluetooth is one of the most common requests we get with Delphi. During our Devices and Gadgets webinar, David I. showed how to browse paired Bluetooth devices and connect to them.

But wait, there's more! Daniele Teti & Daniele Spinetti of bit Time Software created an Android client app to control a BeeWi BBZ201 Mini Cooper S Bluetooth car. It may work with other BeeWi Bluetooth remote control vehicles, but it has not been tested with any others (that I know of).

The Multitouch code is Copyright (c) 2006-2014 Iztok Kacin, Cromis and used under the BSD license.

It currently doesn't have Bluetooth discovery, so once you pair your device you need to update the source code with the MAC address of your car.

I’ve created a GitHub repository for the project. It should work with XE5 or AppMethod just fine. I’ve got a version updated to XE6 that exposes the controls via App Tethering too, which I will upload later.

Categories
Android Graphics iOS Mobile Source Code

OpenGL ES Support on Mobile with XE6

Appmethod, RAD Studio, Delphi and C++Builder XE6 all make it really easy to work with OpenGL ES on mobile devices. Under the covers, FireMonkey is implemented with OpenGL ES on mobile (iOS & Android), OpenGL on OS X and DirectX on Windows. It provides a number of useful abstractions for working with 2D and 3D graphics, but sometimes you just want to get down to a lower level.

Here is all you need to access an OpenGL ES rendering context in your FireMonkey mobile application. This example is in Object Pascal, but should be easy enough to adapt to C++.

  1. Create a new FireMonkey Mobile application
  2. Select 3D application
  3. Add FMX.Types3D to the Interface uses clause
  4. In the Object Inspector, create a new event handler for the OnRender event for your form
  5. You now have access to the OpenGL render context.

You can work with the TContext3D that is passed in via a parameter, and your code will work across platforms automatically. If you want to work with the OpenGL ES APIs directly you can do that too with the following uses clause in your Implementation section:

uses
  // Gives you access to the FMX wrappers for GLES
  FMX.Context.GLES
{$IFDEF ANDROID}
  // Direct access to the Android GLES implementation
  , Androidapi.Gles, FMX.Context.GLES.Android
  // More useful units for Android:
  // , FMX.Platform.Android, Androidapi.Gles2, Androidapi.JNI.OpenGL,
  // Androidapi.Glesext, Androidapi.Gles2ext
{$ENDIF}
{$IFDEF IOS}
  // Direct access to the iOS GLES implementation
  , iOSapi.OpenGLES, FMX.Context.GLES.iOS
  // More useful units for iOS:
  // , iOSapi.GLKIT, FMX.Platform.iOS
{$ENDIF}
  ;

And here is an example event handler with a couple calls to the OpenGL ES APIs:

procedure TForm1.Form3DRender(Sender: TObject; Context: TContext3D);
begin
  glClearColor(1, 1, 0, 1);     // set the clear color to opaque yellow
  glClear(GL_COLOR_BUFFER_BIT); // clear the color buffer with it
end;

This accesses the iOS and Android equivalents of the same OpenGL ES APIs. Thanks to the compiler directives and the cross-platform nature of OpenGL ES, this code just works. I'm not an OpenGL expert, but I looked through the OpenGL ES API and all the routines I tested worked, although I never did anything particularly interesting with them.
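
For comparison, here is roughly the same clear done purely through the cross-platform TContext3D parameter, without touching the GLES units at all. I am going from memory on the exact Clear overload and the TClearTarget enumeration, so treat this strictly as a sketch and check FMX.Types3D if it does not compile as-is.

procedure TForm1.Form3DRender(Sender: TObject; Context: TContext3D);
begin
  // Clear the color buffer to yellow through the FireMonkey abstraction;
  // the same handler body works on Windows, OS X, iOS and Android.
  Context.Clear([TClearTarget.Color], TAlphaColors.Yellow, 1.0, 0);
end;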