How to make your app Xperia Play optimized - Xperia Play General

Hello, I am the developer of Zeus Arena, an Xperia Play optimized port of the ioquake3 engine for Android.
It was a bit of a headache to get Xperia Play controls working when I first started working on Zeus Arena about four or five months ago. Recently, however, I decided to try to support more devices with Zeus Arena by adding touch screen controls. Since all of Zeus Arena's graphics were being done in native code, using a native activity no less, this meant I would either have to maintain two completely separate applications or rewrite most of Zeus Arena. I did the latter.
In the process I found an easy way to add Xperia Play controls to an existing application (for those of you unfamiliar with Zeus Arena, it is built upon kwaak3). So this post will be a brief tutorial on adding Xperia Play controls to an existing application.
First, a note: this tutorial will not tell you how to set up or use the NDK; there are plenty of tutorials for that already. The hardest part of this process should be setting up the NDK.
Adding Xperia Play controls to your existing application:
The main problem with adding Xperia Play controls to your application is the touch pad. (If you don't need to support the touch pad, ignore this tutorial and look online for SE's tutorial; it's easy.) The touchpad requires that you use a native activity to get your input, so the main purpose of this tutorial is to show how to use a native activity while changing your existing code as little as possible.
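For reference, if you only need the gamepad buttons (the path SE's own tutorial covers), they arrive as ordinary key events in a plain Activity. Below is a minimal, hedged sketch of that simpler path; the exact button-to-keycode mapping is an assumption, so log the incoming KeyEvents on a real device to confirm it.
Code:
import android.app.Activity;
import android.view.KeyEvent;

public class PadActivity extends Activity {
    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        switch (keyCode) {
            case KeyEvent.KEYCODE_DPAD_UP:
            case KeyEvent.KEYCODE_DPAD_DOWN:
            case KeyEvent.KEYCODE_DPAD_LEFT:
            case KeyEvent.KEYCODE_DPAD_RIGHT:
                // D-pad: forward to your game's input handling
                return true;
            case KeyEvent.KEYCODE_DPAD_CENTER:  // assumed: the X (cross) button
            case KeyEvent.KEYCODE_BUTTON_X:     // assumed: square
            case KeyEvent.KEYCODE_BUTTON_Y:     // assumed: triangle
            case KeyEvent.KEYCODE_BUTTON_L1:
            case KeyEvent.KEYCODE_BUTTON_R1:
            case KeyEvent.KEYCODE_BUTTON_START:
            case KeyEvent.KEYCODE_BUTTON_SELECT:
                return true;
            case KeyEvent.KEYCODE_BACK:
                // assumed: circle arrives as BACK; only consume it while in-game
                // so normal back navigation still works elsewhere
                return true;
        }
        return super.onKeyDown(keyCode, event);
    }
}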
First you will need to make a native activity in C code. This activity will poll for input from the touchpad and will look up references to the methods in your Java code that deal with the touchpad input (see the sample code below).
Next, open your main activity and change it from extending Activity to extending NativeActivity.
Now, and this is really the key part, as soon as possible after your call to super.onCreate(savedInstanceState); add this line of code: getWindow().takeSurface(null);
That one magical line of code allows you to keep handling your graphics in your Java code, meaning you don't have to change your existing Java code any more than this.
However, we aren't quite done yet. As mentioned above, the native code is getting the input for the touchpad; we probably want to send this to the Java code where all the other event handling takes place.
This is simple: make a Java method that accepts touchpad input, look it up in your native activity, and call it whenever the touchpad is touched.
Example code from Zeus Arena (if you are familiar with Zeus Arena's code (it's open source), the example code won't look too familiar, because it has been modified to make it a bit simpler and some of it is from a new update coming to Zeus Arena soon):
Java code:
public class Game extends NativeActivity{

    private KwaakView mGLSurfaceView; // the custom GLSurfaceView subclass used by Zeus Arena

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().takeSurface(null);
        RegisterThis();
        mGLSurfaceView = new KwaakView(this, this); // a custom made view for Zeus Arena
        setContentView(mGLSurfaceView);
        mGLSurfaceView.requestFocus();
        mGLSurfaceView.setId(1);
    }

    //gives the native activity a copy of this object so it can call OnNativeMotion
    public native int RegisterThis();

    //loads the .so, change the library name to whatever your library is called
    static {
        System.loadLibrary("kwaakjni");
    }

    //called by the native activity whenever touch input is found
    public void OnNativeMotion(int action, int x, int y, int source, int device_id) {
        if(source == 1048584){ //InputDevice.SOURCE_TOUCHPAD (0x00100008)
            // Obtain MotionEvent object
            long downTime = SystemClock.uptimeMillis();
            long eventTime = SystemClock.uptimeMillis() + 100;
            // List of meta states found here: developer.android.com/reference/android/view/KeyEvent.html#getMetaState()
            int metaState = 0;
            MotionEvent motionEvent = MotionEvent.obtain(
                downTime,
                eventTime,
                action,
                x,
                (366-y), // flip y; 366 is the hard-coded touchpad height (see the note on reading it at runtime below)
                metaState
            );
            mGLSurfaceView.onTouchPadEvent(motionEvent); //custom made method for dealing with touch input
        }
        else{
            // Obtain MotionEvent object
            long downTime = SystemClock.uptimeMillis();
            long eventTime = SystemClock.uptimeMillis() + 100;
            // List of meta states found here: developer.android.com/reference/android/view/KeyEvent.html#getMetaState()
            int metaState = 0;
            MotionEvent motionEvent = MotionEvent.obtain(
                downTime,
                eventTime,
                action,
                x,
                y,
                metaState
            );
            // Dispatch touch event to view
            mGLSurfaceView.dispatchTouchEvent(motionEvent);
        }
    }
}
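The two magic numbers in the Java snippet above, the source id 1048584 (InputDevice.SOURCE_TOUCHPAD) and the touchpad height 366, can be looked up at runtime instead of being hardcoded. Here is a hedged sketch (not part of Zeus Arena; the class and method names are made up) that enumerates the input devices, finds the touchpad and reads its Y range. It is written against the Gingerbread-era API; on API 12+ you would use getMotionRange(MotionEvent.AXIS_Y) instead of the deprecated MOTION_RANGE_* constants.
Code:
import android.view.InputDevice;

public final class TouchpadInfo {

    /** Returns the touchpad height in device units, or -1 if no touchpad is found. */
    public static int findTouchpadHeight() {
        for (int id : InputDevice.getDeviceIds()) {
            InputDevice dev = InputDevice.getDevice(id);
            if (dev == null) continue;
            // SOURCE_TOUCHPAD == 0x00100008 == 1048584, the magic number used above
            if ((dev.getSources() & InputDevice.SOURCE_TOUCHPAD) == InputDevice.SOURCE_TOUCHPAD) {
                InputDevice.MotionRange rangeY = dev.getMotionRange(InputDevice.MOTION_RANGE_Y);
                if (rangeY != null) {
                    return (int) (rangeY.getMax() - rangeY.getMin());
                }
            }
        }
        return -1;
    }

    private TouchpadInfo() {}
}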
Native code:
#include <dlfcn.h>
#include <stdio.h>
#include <string.h>
#include <android/log.h>
#include <jni.h>
#include <errno.h>
#include <android_native_app_glue.h>
#include <time.h>
#include <unistd.h>
#include "quake_two_android_Quake2.h"
#define EXPORT_ME __attribute__ ((visibility("default")))
static JavaVM *jVM;
typedef unsigned char BOOL;
#define FALSE 0
#define TRUE 1
//|------------------------------------------------------ NATIVE ACTIVITY ------------------------------------------------------|
static jobject g_pActivity = 0;
static jmethodID javaOnNDKTouch = 0;
static jmethodID javaOnNDKKey = 0; //looked up in JNI_OnLoad below; needed if you also forward key events to java
/**
* Our saved state data.
*/
struct TOUCHSTATE
{
int down;
int x;
int y;
};
/**
* Shared state for our app.
*/
struct ENGINE
{
struct android_app* app;
int render;
int width;
int height;
int has_focus;
//ugly way to track touch states
struct TOUCHSTATE touchstate_screen[64];
struct TOUCHSTATE touchstate_pad[64];
};
void attach(){
}
/**
* Process the next input event.
*/
static
int32_t
engine_handle_input( struct android_app* app, AInputEvent* event )
{
JNIEnv *jni;
(*jVM)->AttachCurrentThread(jVM, &jni, NULL);
struct ENGINE* engine = (struct ENGINE*)app->userData;
if( AInputEvent_getType(event) == AINPUT_EVENT_TYPE_MOTION )
{
int nPointerCount = AMotionEvent_getPointerCount( event );
int nSourceId = AInputEvent_getSource( event );
int n;
for( n = 0 ; n < nPointerCount ; ++n )
{
int nPointerId = AMotionEvent_getPointerId( event, n );
int nAction = AMOTION_EVENT_ACTION_MASK & AMotionEvent_getAction( event );
int nRawAction = AMotionEvent_getAction( event );
struct TOUCHSTATE *touchstate = 0;
if( nSourceId == AINPUT_SOURCE_TOUCHPAD )
touchstate = engine->touchstate_pad;
else
touchstate = engine->touchstate_screen;
if( nAction == AMOTION_EVENT_ACTION_POINTER_DOWN || nAction == AMOTION_EVENT_ACTION_POINTER_UP )
{
int nPointerIndex = (AMotionEvent_getAction( event ) & AMOTION_EVENT_ACTION_POINTER_INDEX_MASK) >> AMOTION_EVENT_ACTION_POINTER_INDEX_SHIFT;
nPointerId = AMotionEvent_getPointerId( event, nPointerIndex );
}
if( nAction == AMOTION_EVENT_ACTION_DOWN || nAction == AMOTION_EVENT_ACTION_POINTER_DOWN )
{
touchstate[nPointerId].down = 1;
}
else if( nAction == AMOTION_EVENT_ACTION_UP || nAction == AMOTION_EVENT_ACTION_POINTER_UP || nAction == AMOTION_EVENT_ACTION_CANCEL )
{
touchstate[nPointerId].down = 0;
}
if (touchstate[nPointerId].down == 1)
{
touchstate[nPointerId].x = AMotionEvent_getX( event, n );
touchstate[nPointerId].y = AMotionEvent_getY( event, n );
}
int handled = 0;
if( jni && g_pActivity ){
//send the event to the java code; this forwards both touch screen and touch pad events. The java code will probably
//still receive touch screen events directly, so forwarding those may not be needed. If it is, key events will need to be intercepted in native code as well.
(*jni)->CallVoidMethod( jni, g_pActivity, javaOnNDKTouch, nRawAction, touchstate[nPointerId].x, touchstate[nPointerId].y, nSourceId, 0 );
}
}
return 1;
}
return 0;
}
/**
* Process the next main command.
*/
static
void
engine_handle_cmd( struct android_app* app, int32_t cmd )
{
struct ENGINE* engine = (struct ENGINE*)app->userData;
switch( cmd )
{
case APP_CMD_SAVE_STATE:
// The system has asked us to save our current state. Do so if needed
break;
case APP_CMD_INIT_WINDOW:
// The window is being shown, get it ready.
if( engine->app->window != NULL )
{
engine->has_focus = 1;
}
break;
case APP_CMD_GAINED_FOCUS:
engine->has_focus = 1;
break;
case APP_CMD_LOST_FOCUS:
// When our app loses focus, we stop rendering.
engine->render = 0;
engine->has_focus = 0;
//engine_draw_frame( engine );
break;
}
}
/**
* This is the main entry point of a native application that is using
* android_native_app_glue. It runs in its own thread, with its own
* event loop for receiving input events and doing other things (rendering).
*/
void
android_main( struct android_app* state )
{
struct ENGINE engine;
// Make sure glue isn't stripped.
app_dummy();
memset( &engine, 0, sizeof(engine) );
state->userData = &engine;
state->onAppCmd = engine_handle_cmd;
state->onInputEvent = engine_handle_input;
engine.app = state;
//setup(state);
//JNIEnv *env;
//(*jVM)->AttachCurrentThread(jVM, &env, NULL);
if( state->savedState != NULL )
{
// We are starting with a previous saved state; restore from it.
}
// our 'main loop'
while( 1 )
{
// Read all pending events.
int ident;
int events;
struct android_poll_source* source;
// If not rendering, we will block forever waiting for events.
// If rendering, we loop until all events are read, then continue
// to draw the next frame.
while( (ident = ALooper_pollAll( 100, NULL, &events, (void**)&source) ) >= 0 )
{
// Process this event.
// This will call the function pointer android_app->onInputEvent, which in our case is
// engine_handle_input()
if( source != NULL )
{
source->process( state, source );
}
// Check if we are exiting.
if( state->destroyRequested != 0 )
{
return;
}
usleep(17000); //~17 milliseconds, roughly one frame at 60 Hz
}
}
}
//note: the JNI function name must match the package and class of the java activity that declares RegisterThis()
jint EXPORT_ME
JNICALL Java_quake_two_android_Quake2_RegisterThis(JNIEnv * env, jobject clazz){
g_pActivity = (jobject)(*env)->NewGlobalRef(env, clazz);
return 0;
}
jint EXPORT_ME JNICALL
JNI_OnLoad(JavaVM * vm, void * reserved)
{
JNIEnv *env;
jVM = vm;
if((*vm)->GetEnv(vm, (void**) &env, JNI_VERSION_1_4) != JNI_OK)
{
return -1;
}
const char* interface_path = "quake/two/android/Quake2"; //again, change this to match your own activity's package and class
jclass java_activity_class = (*env)->FindClass( env, interface_path );
javaOnNDKTouch = (*env)->GetMethodID( env, java_activity_class, "OnNativeMotion", "(IIIII)V");
javaOnNDKKey = (*env)->GetMethodID( env, java_activity_class, "OnNativeKeyPress", "(III)V");
return JNI_VERSION_1_4;
}
Licences:
/*
* Copyright (c) 2011, Sony Ericsson Mobile Communications AB.
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are met:
* * Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* * Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* * Neither the name of the Sony Ericsson Mobile Communications AB nor the
* names of its contributors may be used to endorse or promote products
* derived from this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
* LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
* CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
* SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
* INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
* CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGE.
*/
/*
* This example uses the NDK and a helper library available in the NDK called 'native app glue',
* which is available in %NDK_ROOT/source/android/native_app_glue. If you are new to NDK or to
* the NativeActivity, look through the native_app_glue source to see how you should set up your
* native app and handle callbacks and messages from Android. Note that the callbacks registered
* in the ANativeActivity_onCreate() entry-point must return in a timely manner, as does
* ANativeActivity_onCreate() itself. The Native App Glue does this by creating a pipe() and
* synchronization objects to handle communication between the Android, the NativeActivity and
* the game/sample logic.
*
* In this example, we read the 'pointer' information from touch events from both the touch-screen
* and the touch-pad (if available). We store their positions and state (up or down), then draw
* the touch positions scaled to the screen.
*
* Although we are using hard-coded values for the touch-pad resolution, you can and should read
* those values at runtime in java by enumerating InputDevices and finding the touchpad device.
*
*/
/*
* Kwaak3 - Java to quake3 interface
* Copyright (C) 2010 Roderick Colenbrander
*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU General Public License
* as published by the Free Software Foundation; either version 2
* of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
*/

tl;dr
Seriously though, that's awesome. Hopefully this gets implemented and put to good use!

Yeah, I just had to scroll down to see if there were any replies, and it is damn long.
Most of it is example code which can be skipped unless you're actually going to use it; the first section tells you what you need to know.
Also, anyone can take this tutorial and post it anywhere people may want to see it.

Sweet, thanks for posting! I've only done a bit of development, and eventually want to get a game going - this is bookmarked and thanked!!!

bigbison said:
... of it is example code which can be skipped unless your actually going to use ...
Click to expand...
Click to collapse
bigbison,
I've also been playing a bit with the native app example, and I still have some questions and didn't find a good place to ask them. So, maybe you can help me out?
If I understand it correctly ...
The onAppCmd and onInputEvent callbacks are called from within the main application thread.
The android_main is running in a separate thread.
But I don't see any mutexes (or other locking mechanism) in place to prevent simultaneous access to the same application data from both threads. Could it be that your code (and also the standard example code) is not thread-safe?
Are we in for some random crashes? Or am I missing something? (I hope so ...)

Dear XDA Admins...
PIN THIS THREAD RIGHT NOW!!!
...thank you.

Nice post, hopefully helps someone....
Just to let you know there is an emoticon in your code that should be fixed....
thanks.

Just wanted to say I'm a big fan of Zeus Arena. It's really nice.
Very fun to play offline too.
Great work man, you got the Xperia controls working perfectly.

wiffeltje said:
bigbison,
... Or am I missing something ? (I hope so ...)
Click to expand...
Click to collapse
Just answering my own question.
After posting my question I took yet another look at the android_native_app_glue code and at how ALooper_pollAll interacts with it. And looking at it again, I think I get it.
It looks like the threading is OK and that (most of) the main loop won't be able to run while the callback functions are running. So, that should be OK.
So, I did overlook some things.
Sorry to disturb you a little bit too early with my question.

OnNativeKeyPress
Hi,
I am trying to follow this a little. I can access the touchpad just fine in my NativeActivity, but I have lost all access to onKeyDown and onKeyUp using the Sony tutorial. Your code mentions OnNativeKeyPress but I can't see you calling it anywhere; is this expected?
Thanks for any pointers you can offer!
bigbison said:
Hello, I am the developer of Zeus Arena, an Xperia Play optimized port of the ioquake3 engine for Android. ...
Click to expand...
Click to collapse

ninjatjj said:
Hi,
I am trying to follow this a little, I can access the touchpad just fine in my nativeactivity but I have lost all access to onKeyDown and onKeyUp using the Sony tutorial. Your code mentions OnNativeKeyPress but I can't see you calling this anywhere, is this expected?
Thanks for any pointers you can offer!
Click to expand...
Click to collapse
If you're still searching:
Code:
static
int32_t
engine_handle_input( struct android_app* app, AInputEvent* event )
{
JNIEnv *jni;
(*jVM)->AttachCurrentThread(jVM, &jni, NULL);
struct ENGINE* engine = (struct ENGINE*)app->userData;
if( AInputEvent_getType(event) == AINPUT_EVENT_TYPE_MOTION )
{
int nPointerCount = AMotionEvent_getPointerCount( event );
int nSourceId = AInputEvent_getSource( event );
int n;
jboolean newTouch = JNI_TRUE;
for( n = 0 ; n < nPointerCount ; ++n )
{
int nPointerId = AMotionEvent_getPointerId( event, n );
int nAction = AMOTION_EVENT_ACTION_MASK & AMotionEvent_getAction( event );
int nRawAction = AMotionEvent_getAction( event );
struct TOUCHSTATE *touchstate = 0;
if( nSourceId == AINPUT_SOURCE_TOUCHPAD )
touchstate = engine->touchstate_pad;
else
touchstate = engine->touchstate_screen;
if( nAction == AMOTION_EVENT_ACTION_POINTER_DOWN || nAction == AMOTION_EVENT_ACTION_POINTER_UP )
{
int nPointerIndex = (AMotionEvent_getAction( event ) & AMOTION_EVENT_ACTION_POINTER_INDEX_MASK) >> AMOTION_EVENT_ACTION_POINTER_INDEX_SHIFT;
nPointerId = AMotionEvent_getPointerId( event, nPointerIndex );
}
if( nAction == AMOTION_EVENT_ACTION_DOWN || nAction == AMOTION_EVENT_ACTION_POINTER_DOWN )
{
touchstate[nPointerId].down = 1;
}
else if( nAction == AMOTION_EVENT_ACTION_UP || nAction == AMOTION_EVENT_ACTION_POINTER_UP || nAction == AMOTION_EVENT_ACTION_CANCEL )
{
touchstate[nPointerId].down = 0;
}
if (touchstate[nPointerId].down == 1)
{
touchstate[nPointerId].x = AMotionEvent_getX( event, n );
touchstate[nPointerId].y = AMotionEvent_getY( event, n );
}
int handled = 0;
if( jni && g_pActivity ){
(*jni)->CallVoidMethod( jni, g_pActivity, javaOnNDKTouch, nRawAction, touchstate[nPointerId].x, touchstate[nPointerId].y, nSourceId, 0, newTouch);
}
newTouch = JNI_FALSE;
}
return 1;
}
else if (AInputEvent_getType(event) == AINPUT_EVENT_TYPE_KEY){
int action = AKeyEvent_getAction(event);
int keyCode = AKeyEvent_getKeyCode(event);
if(jni && g_pActivity){
if((*jni)->ExceptionCheck(jni)) {
(*jni)->ExceptionDescribe(jni);
(*jni)->ExceptionClear(jni);
}
(*jni)->CallIntMethod(jni, g_pActivity, javaOnNDKKey, action, keyCode, AKeyEvent_getMetaState(event));
}
}
return 0;
}
taken from: https://play.google.com/store/apps/details?id=zeus.arena.source&hl=en&rdid=zeus.arena.source
Btw: The application should not be run. It doesn't work and will waste your time. Open the APK with a file browser and get the 7z out of the assets manually. There are plenty of places to copy that code from, but I guess it wasn't on Sony's developer page :/

Thanks for the code snippet, I managed to get this working already: github.com/ninjatjj/btjoypad

ninjatjj said:
Thanks for the code snippet, I managed to get this working already: github.com/ninjatjj/btjoypad
Click to expand...
Click to collapse
Alright. I just added support to reicast and stumbled on this while trying to find the Sony page that moved. This is a very basic copy-paste implementation; it should really explain the what and the why if it's going to leave out so much.
I'll take a look at how you did it shortly and let you know if there's anything I did that might help. I rewrote a lot of the native code to allow skipping the input handling for everything but the Play.

twistedumbrella said:
Alright. I just added support to reicast and stumbled on this trying to find the Sony page that moved. This is a very basic copy paste implementation. It should really explain the what and why if it's going to leave out so much.
I'll take a look at how you did it shortly and let you know if there's anything I did that might help. I rewrote a lot of the native to allow skipping over the input for everything but the Play.
Click to expand...
Click to collapse
Much appreciated - I was actually checking out reicast yesterday but I don't have my MvC rom with me (on holiday) - are you getting decent performance on the Xperia Play? That is one device that definitely needs a hardware upgrade soon.

ninjatjj said:
Much appreciated - I was actually checking out reicast yesterday but I don't have my MvC rom with me (on holiday) - are you getting decent performance on the xperia play? That is one device that definitely needs a hardware upgrade soon.
Click to expand...
Click to collapse
I saw you're using the full native version, so not much of what I did will have any benefit. I use the native side just to pass events back to Java for ease of use now, since everything else is Java-based.
It is starting to. It doesn't work on Gingerbread due to memory allocation issues with the build, but I get the cutscenes at about 60 fps and gameplay is anywhere from 20 to 40 depending on how intense the scene is. That's what I've been working on, though: every little option to squeeze another 10 or so out of it.

Related

[IDEA] Xbox360 Controller for Android

Is it possible to make the Xbox 360 controller connect to your Android phone?
There's a lot of software for both Mac and Windows letting you connect it to your computer, so without too much research I wondered if there is a possibility to port one of those apps over to Android and use the controller for emulators and games.
There must be some open-source apps for Win/OSX out there, I think, so maybe a dev could have a look at it; for me, reading that code would be like trying to teach myself Greek..
Just an idea, but if there's anyone that could make something out of this, I think there will be a lot of people wanting to use it.
Kinda like the SNES controller the Streak's got, only wireless instead.. haha
Xbox 360 controllers aren't Bluetooth.
Sent from my HTC Droid Incredible. This Droid Does More.
It's not just software that lets an Xbox controller connect to a PC; unless you're talking about the wired version, you need the receiver.
Like generamerica says, Xbox controllers aren't BT. PS3 ones are, but the Bluetooth they use is slightly different than normal, and if I recall there have been issues in the past trying to get it to connect; no idea where that project has gone since then.
The only controller from the big three that can work with Android at the moment is the Wiimote, and even then you have to be running the Bluetooth stack from AOSP instead of the Sense stack.
You got a lot of Bluetooth receivers!
It's possible!
GéVé said:
You got a lot of Bluetooth receivers !
It's possible !
Click to expand...
Click to collapse
!!!!!!! I want it, a controller for Android... that way I can play PSX and GBA games
Why not a PS3 controller?
Yeah, 360 would be impractical if not impossible, as you would have to either use a wired controller or have a wireless adapter for it to work. All of which means more stuff to lug around with an already large controller.
It would be cool to get the PS3 controller working on Android, but the controller is fairly big to carry around with your phone.
Btw, the PS3 controller works great on Windows. I use it with my netbook to play games in between classes. It connects to my normal Bluetooth receiver without any issues, using a custom driver and software to pair the devices.
The only problem I can see with getting a PS3 controller working on Android is that once you pair the controller to the receiver you can no longer use it with any other devices unless you unpair it and uninstall the driver to restore the original setup. This is how it is with the controller on Windows, and most people use two Bluetooth receivers to deal with the problem.
Regardless of how practical it would be to carry a PS3 or Xbox controller around with your phone, it would be sweet to have working, especially to get some of the other Bluetooth controllers that are out there working. I have seen Asus controllers that are Bluetooth and have all the same buttons as PS3 controllers, but they fold up and are a lot smaller and easier to carry around.
And a Bluetooth controller would be great for those of us who do not have enough hardware keys to play games with - no hardware keyboard.
Sent from my HTC Hero CDMA using XDA App
It's hardly impractical. If you were chilling at work on a break and you and a friend wanted a game of Street Fighter on Tiger MAME or something, then linking two Xbox pads to your phone and having a few rounds would be superb. I would avoid even bothering with wireless on the Xbox pad because it's not Bluetooth, and the PS3 pad is problematic for most devices using Sense (which means half of them), so it's just the wired Xbox pad. A port or a fresh app which can link the on-screen buttons of games or emulators to the digital buttons would be fantastic.
Devs would pick up on this and would probably develop around it if it became successful enough. I think it's a good idea, and the reason I say that is because the on-screen controls on the devices out now have some Achilles' heels: first, a touchscreen is just not the same as a pad; second, there is only ever going to be one player per device, while the likes of Street Fighter II Turbo and King of Fighters were meant to be played with two players, not to mention the number of other game types which would play better on a pad than on a touchscreen.
In an ideal world it would be a cable to your TV and a cable to your pad: run a game on your phone and play it on your TV with your Xbox pad wired.
I know it can be done, might look into doing it myself. USB can be split too, analogue could come later, hmmmmm.
There is an actual implementation of Xbox 360 controller support for the Elocity A7 tablet:
[ROM] Emulator / XBox360 Controller Mod 1-21-2011
But idk if it could be adapted to other Android devices, since it seems to need USB port support.
Sine. said:
There is an actual implementation of Xbox360 controler for the Elocity A7 tablet,
[ROM] Emulator / XBox360 Controller Mod 1-21-2011
But idk if it could be adapted to other android devices since it seems to need an usb port support.
Click to expand...
Click to collapse
Yeah, in order to use the wired controller, you would need USB Host mode drivers, which at the moment not a lot of Android phones have (yet). At least, not to my knowledge anyway. Then you would need some sort of adapter to go from Mini/Micro USB (depending on which phone you have) to regular USB.
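To make that requirement concrete, here is a hedged Java sketch (API 12+; the class name is made up for illustration) that checks whether the device exposes USB host support and logs whatever is attached, e.g. a wired 360 pad behind an OTG adapter. Actually driving the controller still needs a driver or a userspace implementation on top of this.
Code:
import android.content.Context;
import android.content.pm.PackageManager;
import android.hardware.usb.UsbDevice;
import android.hardware.usb.UsbManager;
import android.util.Log;

public final class UsbHostCheck {

    public static void dumpAttachedDevices(Context context) {
        PackageManager pm = context.getPackageManager();
        if (!pm.hasSystemFeature(PackageManager.FEATURE_USB_HOST)) {
            Log.i("UsbHostCheck", "No USB host support on this device");
            return;
        }
        UsbManager usb = (UsbManager) context.getSystemService(Context.USB_SERVICE);
        for (UsbDevice device : usb.getDeviceList().values()) {
            Log.i("UsbHostCheck", "Attached device: vendor=0x"
                    + Integer.toHexString(device.getVendorId())
                    + " product=0x" + Integer.toHexString(device.getProductId()));
        }
    }

    private UsbHostCheck() {}
}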
Sine. said:
There is an actual implementation of Xbox360 controler for the Elocity A7 tablet,
[ROM] Emulator / XBox360 Controller Mod 1-21-2011
But idk if it could be adapted to other android devices since it seems to need an usb port support.
Click to expand...
Click to collapse
abrigham said:
Yeah, in order to use the wired controller, you would need USB Host mode drivers, which at the moment not a lot of Android phones have (yet). At least, not to my knowledge anyway. Then you would need some sort of adapter to go from Mini/Micro USB (depending on which phone you have) to regular USB.
Click to expand...
Click to collapse
Well, I've seen some articles where people have some sort of adapter that turns mini/micro USB into a dual input (mouse/keyboard), so I guess two controllers shouldn't be that much further off... it's a matter of who has the knowledge and can implement it... unfortunately, it isn't me =( heh
But for the devices with USB ports, how would one go about getting the required support into their ROM?
When you plug the 360 controller in, nothing happens.. I would like to know what files I may need to push to my Android device that HAS full USB ports.
sent from gv1.5 on g2
Get a micro USB to female USB cable and plug in the wireless adapter. Your device has to be running Android 4.0 for this to work, but it does work flawlessly. You then go to the Market and download a free app that allows you to configure the buttons on the controller.... then you're all set. There are also a few games that already have options to use third-party controllers. I'm planning on doing this as soon as we get some kind of ICS love from any direction for the SGS2 SGH-T989. I hooked this up on a customer's Galaxy Tab. Awesome...
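On the app side, once a third-party pad is recognized (natively on a Honeycomb/ICS device, or via a mapping app that injects standard events), games read it through the normal input APIs. A hedged sketch (API 12+, class name made up for illustration):
Code:
import android.app.Activity;
import android.view.InputDevice;
import android.view.KeyEvent;
import android.view.MotionEvent;

public class GamepadActivity extends Activity {

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        // Gamepad buttons arrive as ordinary key events with a gamepad source
        if ((event.getSource() & InputDevice.SOURCE_GAMEPAD) == InputDevice.SOURCE_GAMEPAD
                && keyCode == KeyEvent.KEYCODE_BUTTON_A) {
            // handle the A/cross button here
            return true;
        }
        return super.onKeyDown(keyCode, event);
    }

    @Override
    public boolean onGenericMotionEvent(MotionEvent event) {
        // Analog sticks arrive as generic motion events with a joystick source
        if ((event.getSource() & InputDevice.SOURCE_JOYSTICK) == InputDevice.SOURCE_JOYSTICK
                && event.getAction() == MotionEvent.ACTION_MOVE) {
            float leftX = event.getAxisValue(MotionEvent.AXIS_X);
            float leftY = event.getAxisValue(MotionEvent.AXIS_Y);
            // feed leftX / leftY into the game's movement code
            return true;
        }
        return super.onGenericMotionEvent(event);
    }
}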
This doesn't have to do with the Xbox controller, but on my old iPod touch I could use a Wii controller to play games on a NES emulator. If it's possible with iOS, then it has to be possible with Android.
Sent from my GT-I9100 using Tapatalk
Well, maybe it's possible to make Android use pads already connected to a PC? I have an Xbox 360 pad with the PC receiver and I am wondering if it's actually possible to connect that controller to an Android device (mine is an HTC Desire Z).
I'm afraid there could be some input lag, but maybe it can be done?
Plausible and an achievable concept... is it worth it? Lol?
This got me thinking.
I don't know why anyone would want to do this nowadays, but it is very possible.
Back in 2007 RemoteJoy came out on the PSP-1001 Phat.
Basically the concept is simple: you would put a couple of .prx files on the Memory Stick Pro Duo, in a folder named plugins in the root directory, and set up a Windows driver and a small application, I think. I'm not 100% sure about that, but it seems logical.
Then you would interface the PSP with the PC via USB. Before interfacing, you would have to go into the VSH menu: you would hit a button combo and a text overlay would appear over the XMB in the top left-hand corner, where you could select which plugin you wanted to load. Then you would load up the .prx files, plug in your controller, and connect USB type B from the PSP to the PC. It would be detected through libusb and then you simply execute remotejoy.exe, "I think", don't quote me on any of this. It was a long time ago....
Then you would literally be able to play your game on your monitor, or on a flat screen like me: via DVI output from an ant 1600xt graphics card in CrossFire and my 10000 pound Sony professional 42 inch $4000 plasma..... And then through the RemoteJoy GUI you could go to controllers and basically use any analog controller that there were drivers for...... Man, things have really changed in 13 years!!!!.... Jeez.
These plugins could add all kinds of extra features: an FPS counter, a background music player, an IR (yeah, infrared) manager, a badass file explorer with every feature you could ever imagine (well, there was an IRShell .prx), and a PSX one called POPS before Sony ever even released it.....
Sound familiar? It should: first there was the Xposed framework, now we have Magisk plugins with support for Xposed modules....
With that being said, this is very possible and a feasible feat. (Not that anyone should take it on.....)
It was coded under Unix, so the same kernel family we use on Android..... It could be forked/updated/shimmed/referenced/converted/reverse engineered into a RemoteJoy Magisk plugin, but it's old code.
I think the last time it was updated was in '09, when I graduated high school.... so someone way more talented than me could possibly modify and retrofit or port it to Android using adb and USB debugging on a rooted device.
But this is not my area of expertise; ports aren't my bag, man.
It would be a huge undertaking.
For controller support it would use sockets, TCP specifically.
I didn't dig into it too much because, well, everyone nowadays has smart share or screen mirroring and Bluetooth Xbox One S controllers and 4K TVs......
Just wanted to say it is a very plausible and achievable project.
Sorry to dig up an old thread. I have the original source code somewhere.
Here's an excerpt for controller support and mapping via universal serial ports; it's in raw form.
/*
* PSPLINK
* -----------------------------------------------------------------------
* Licensed under the BSD license, see LICENSE in PSPLINK root for details.
*
* remotejoy.c - PSPLINK PC remote joystick handler (SDL Version)
*
* Copyright (c) 2006 James F <[email protected]>
*
* $HeadURL: svn://svn.pspdev.org/psp/branches/psplinkusb/tools/remotejoy/pcsdl/remotejoy.c $
* $Id: remotejoy.c 2187 2007-02-20 19:28:00Z tyranid $
*/
#include <stdio.h>
#include <unistd.h>
#include <stdlib.h>
#include <sys/select.h>
#include <sys/socket.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <netinet/in.h>
#include <netinet/tcp.h>
#include <netdb.h>
#include <limits.h>
#include <errno.h>
#include <ctype.h>
#include <signal.h>
#include <string.h>
#include <SDL.h>
#include <SDL_thread.h>
#include "../remotejoy.h"
#define DEFAULT_PORT 10004
#define DEFAULT_IP "localhost"
#define MAX_AXES_NUM 32767
#define DIGITAL_TOL 10000
#define PSP_SCREEN_W 480
#define PSP_SCREEN_H 272
#define EVENT_ENABLE_SCREEN 1
#define EVENT_RENDER_FRAME_1 2
#define EVENT_RENDER_FRAME_2 3
#define EVENT_DISABLE_SCREEN 4
#ifndef SOL_TCP
#define SOL_TCP IPPROTO_TCP
#endif
#if defined BUILD_BIGENDIAN || defined _BIG_ENDIAN
uint16_t swap16(uint16_t i)
{
uint8_t *p = (uint8_t *) &i;
uint16_t ret;
ret = (p[1] << 8) | p[0];
return ret;
}
uint32_t swap32(uint32_t i)
{
uint8_t *p = (uint8_t *) &i;
uint32_t ret;
ret = (p[3] << 24) | (p[2] << 16) | (p[1] << 8) | p[0];
return ret;
}
uint64_t swap64(uint64_t i)
{
uint8_t *p = (uint8_t *) &i;
uint64_t ret;
ret = (uint64_t) p[0] | ((uint64_t) p[1] << 8) | ((uint64_t) p[2] << 16) | ((uint64_t) p[3] << 24)
| ((uint64_t) p[4] << 32) | ((uint64_t) p[5] << 40) | ((uint64_t) p[6] << 48) | ((uint64_t) p[7] << 56);
return ret;
}
#define LE16(x) swap16(x)
#define LE32(x) swap32(x)
#define LE64(x) swap64(x)
#else
#define LE16(x) (x)
#define LE32(x) (x)
#define LE64(x) (x)
#endif
enum PspCtrlButtons
{
/** Select button. */
PSP_CTRL_SELECT = 0x000001,
/** Start button. */
PSP_CTRL_START = 0x000008,
/** Up D-Pad button. */
PSP_CTRL_UP = 0x000010,
/** Right D-Pad button. */
PSP_CTRL_RIGHT = 0x000020,
/** Down D-Pad button. */
PSP_CTRL_DOWN = 0x000040,
/** Left D-Pad button. */
PSP_CTRL_LEFT = 0x000080,
/** Left trigger. */
PSP_CTRL_LTRIGGER = 0x000100,
/** Right trigger. */
PSP_CTRL_RTRIGGER = 0x000200,
/** Triangle button. */
PSP_CTRL_TRIANGLE = 0x001000,
/** Circle button. */
PSP_CTRL_CIRCLE = 0x002000,
/** Cross button. */
PSP_CTRL_CROSS = 0x004000,
/** Square button. */
PSP_CTRL_SQUARE = 0x008000,
/** Home button. */
PSP_CTRL_HOME = 0x010000,
/** Music Note button. */
PSP_CTRL_NOTE = 0x800000,
/** Screen button. */
PSP_CTRL_SCREEN = 0x400000,
/** Volume up button. */
PSP_CTRL_VOLUP = 0x100000,
/** Volume down button. */
PSP_CTRL_VOLDOWN = 0x200000,
};
enum PspButtons
{
PSP_BUTTON_CROSS = 0,
PSP_BUTTON_CIRCLE = 1,
PSP_BUTTON_TRIANGLE = 2,
PSP_BUTTON_SQUARE = 3,
PSP_BUTTON_LTRIGGER = 4,
PSP_BUTTON_RTRIGGER = 5,
PSP_BUTTON_START = 6,
PSP_BUTTON_SELECT = 7,
PSP_BUTTON_UP = 8,
PSP_BUTTON_DOWN = 9,
PSP_BUTTON_LEFT = 10,
PSP_BUTTON_RIGHT = 11,
PSP_BUTTON_HOME = 12,
PSP_BUTTON_NOTE = 13,
PSP_BUTTON_SCREEN = 14,
PSP_BUTTON_VOLUP = 15,
PSP_BUTTON_VOLDOWN = 16,
PSP_BUTTON_MAX = 17
};
unsigned int g_bitmap[PSP_BUTTON_MAX] = {
PSP_CTRL_CROSS, PSP_CTRL_CIRCLE, PSP_CTRL_TRIANGLE, PSP_CTRL_SQUARE,
PSP_CTRL_LTRIGGER, PSP_CTRL_RTRIGGER, PSP_CTRL_START, PSP_CTRL_SELECT,
PSP_CTRL_UP, PSP_CTRL_DOWN, PSP_CTRL_LEFT, PSP_CTRL_RIGHT, PSP_CTRL_HOME,
PSP_CTRL_NOTE, PSP_CTRL_SCREEN, PSP_CTRL_VOLUP, PSP_CTRL_VOLDOWN
};
const char *map_names[PSP_BUTTON_MAX] = {
"cross", "circle", "triangle", "square",
"ltrig", "rtrig", "start", "select",
"up", "down", "left", "right", "home",
"note", "screen", "volup", "voldown"
};
/* Maps the buttons on the joystick to the buttons on the PSP controller */
unsigned int *g_buttmap = NULL;
struct Args
{
const char *ip;
unsigned short port;
const char *dev;
const char *mapfile;
int verbose;
int video;
int fullscreen;
int droprate;
int fullcolour;
int halfsize;
int showfps;
};
struct GlobalContext
{
struct Args args;
struct sockaddr_in serv;
char name[128];
unsigned int version;
unsigned char axes;
unsigned char buttons;
int exit;
int digital;
int analog;
int tol;
int scron;
};
struct GlobalContext g_context;
struct ScreenBuffer
{
unsigned char buf[PSP_SCREEN_W * PSP_SCREEN_H * 4];
struct JoyScrHeader head;
/* Mutex? */
};
static struct ScreenBuffer g_buffers[2];
void init_font(void);
void print_text(SDL_Surface *screen, int x, int y, const char *fmt, ...);
/* Should have a mutex on each screen */
#define VERBOSE (g_context.args.verbose)
int fixed_write(int s, const void *buf, int len)
{
int written = 0;
while(written < len)
{
int ret;
ret = write(s, buf+written, len-written);
if(ret < 0)
{
if(errno != EINTR)
{
perror("write");
written = -1;
break;
}
}
else
{
written += ret;
}
}
return written;
}
int parse_args(int argc, char **argv, struct Args *args)
{
memset(args, 0, sizeof(*args));
args->ip = DEFAULT_IP;
args->port = DEFAULT_PORT;
while(1)
{
int ch;
int error = 0;
ch = getopt(argc, argv, "vsfchldp:i:m:r:");
if(ch < 0)
{
break;
}
switch(ch)
{
case 'p': args->port = atoi(optarg);
break;
case 'i': args->ip = optarg;
break;
case 'm': args->mapfile = optarg;
break;
case 'v': args->verbose = 1;
break;
case 'd': args->video = 1;
break;
case 'f': args->fullscreen = 1;
break;
case 'c': args->fullcolour = 1;
break;
case 'l': args->halfsize = 1;
break;
case 's': args->showfps = 1;
break;
case 'r': args->droprate = atoi(optarg);
if((args->droprate < 0) || (args->droprate > 59))
{
fprintf(stderr, "Invalid drop rate (0 <= r < 60)\n");
error = 1;
}
break;
case 'h':
default : error = 1;
break;
};
if(error)
{
return 0;
}
}
argc -= optind;
argv += optind;
return 1;
}
void print_help(void)
{
fprintf(stderr, "Remotejoy Help\n");
fprintf(stderr, "Usage: remotejoy [options]\n");
fprintf(stderr, "Options:\n");
fprintf(stderr, "-p port : Specify the port number\n");
fprintf(stderr, "-i ip : Specify the ip address (default %s)\n", DEFAULT_IP);
fprintf(stderr, "-m mapfile : Specify a file to map joystick buttons to the PSP\n");
fprintf(stderr, "-d : Auto enable display support\n");
fprintf(stderr, "-f : Full screen mode\n");
fprintf(stderr, "-r drop : Frame Skip, 0 (auto), 1 (1/2), 2 (1/3), 3(1/4) etc.\n");
fprintf(stderr, "-c : Full colour mode\n");
fprintf(stderr, "-l : Half size mode (both X and Y)\n");
fprintf(stderr, "-s : Show fps\n");
fprintf(stderr, "-v : Verbose mode\n");
}
int init_sockaddr(struct sockaddr_in *name, const char *ipaddr, unsigned short port)
{
struct hostent *hostinfo;
name->sin_family = AF_INET;
name->sin_port = htons(port);
hostinfo = gethostbyname(ipaddr);
if(hostinfo == NULL)
{
fprintf(stderr, "Unknown host %s\n", ipaddr);
return 0;
}
name->sin_addr = *(struct in_addr *) hostinfo->h_addr;
return 1;
}
int connect_to(const char *ipaddr, unsigned short port)
{
struct sockaddr_in name;
int sock = -1;
int flag = 1;
sock = socket(PF_INET, SOCK_STREAM, 0);
if(sock < 0)
{
perror("socket");
return -1;
}
if(!init_sockaddr(&name, ipaddr, port))
{
printf("Could not initialise socket address\n");
close(sock);
return -1;
}
if(connect(sock, (struct sockaddr *) &name, sizeof(name)) < 0)
{
perror("connect");
close(sock);
return -1;
}
/* Disable NAGLE's algorithm to prevent the packets being joined */
setsockopt(sock, SOL_TCP, TCP_NODELAY, &flag, sizeof(int));
return sock;
}
int get_joyinfo(SDL_Joystick *stick)
{
const char *name;
name = SDL_JoystickName(0);
if(!name)
{
return 0;
}
strcpy(g_context.name, name);
g_context.axes = SDL_JoystickNumAxes(stick);
g_context.buttons = SDL_JoystickNumButtons(stick);
return 1;
}
void remove_wsp(char *buf)
{
int len = strlen(buf);
int i = 0;
while(isspace(buf[i]))
{
i++;
}
if(i > 0)
{
len -= i;
memmove(buf, &buf[i], len + 1);
}
if(len <= 0)
{
return;
}
i = len-1;
while(isspace(buf[i]))
{
buf[i--] = 0;
}
}
int build_map(const char *mapfile, int buttons)
{
int i;
FILE *fp;
g_context.analog = -1;
g_context.digital = -1;
g_context.tol = DIGITAL_TOL;
g_buttmap = (unsigned int *) malloc(buttons * sizeof(unsigned int));
if(g_buttmap == NULL)
{
return 0;
}
for(i = 0; i < buttons; i++)
{
/* Fill with mappings, repeat if more than 8 buttons */
g_buttmap[i] = i % 8;
}
if(mapfile)
{
char buffer[512];
int line = 0;
fp = fopen(mapfile, "r");
if(fp == NULL)
{
fprintf(stderr, "Could not open mapfile %s\n", mapfile);
return 0;
}
while(fgets(buffer, sizeof(buffer), fp))
{
char *tok, *val;
int butt;
line++;
remove_wsp(buffer);
if((buffer[0] == '#') || (buffer[0] == 0)) /* Comment or empty line */
{
continue;
}
tok = strtok(buffer, ":");
val = strtok(NULL, "");
if((tok == NULL) || (val == NULL))
{
printf("Invalid mapping on line %d\n", line);
continue;
}
butt = atoi(val);
for(i = 0; i < PSP_BUTTON_MAX; i++)
{
if(strcasecmp(map_names[i], tok) == 0)
{
g_buttmap[butt] = i;
break;
}
}
if(i == PSP_BUTTON_MAX)
{
if(strcasecmp("analog", tok) == 0)
{
g_context.analog = butt;
}
else if(strcasecmp("digital", tok) == 0)
{
g_context.digital = butt;
}
else if(strcasecmp("tol", tok) == 0)
{
g_context.tol = atoi(val);
}
else
{
fprintf(stderr, "Unknown map type %s\n", tok);
}
}
}
fclose(fp);
}
return 1;
}
int send_event(int sock, int type, unsigned int value)
{
struct JoyEvent event;
if(sock < 0)
{
return 0;
}
/* Note, should swap endian */
event.magic = LE32(JOY_MAGIC);
event.type = LE32(type);
event.value = LE32(value);
if(fixed_write(sock, &event, sizeof(event)) != sizeof(event))
{
fprintf(stderr, "Could not write out data to socket\n");
return 0;
}
return 1;
}
void post_event(int no)
{
SDL_Event event;
event.type = SDL_USEREVENT;
event.user.code = no;
event.user.data1 = NULL;
event.user.data2 = NULL;
SDL_PushEvent(&event);
}
int flush_socket(int sock)
{
/* If we encounter some horrible error which means we are desynced
* then send a video off packet to remotejoy, wait around for a second sucking up
* any more data from the socket and then reenable */
return 0;
}
void update_fps(SDL_Surface *screen)
{
#define FRAME_VALUES 32
static Uint32 times[FRAME_VALUES];
static Uint32 lastticks = 0;
static int index = 0;
Uint32 ticks;
int i;
double fps;
ticks = SDL_GetTicks();
times[index] = ticks - lastticks;
index = (index + 1) % FRAME_VALUES;
lastticks = ticks;
fps = 0.0;
for(i = 0; i < FRAME_VALUES; i++)
{
fps += (double) times[i];
}
fps /= (double) FRAME_VALUES;
/* Fps is now average frame time */
fps = 1000.0 / fps;
/* Now frame frequency in Hz */
print_text(screen, 0, 0, "Fps: %.2f", fps);
}
int read_thread(void *p)
{
int err = 0;
int frame = 0;
fd_set saveset, readset;
int count;
int sock = *(int *) p;
struct JoyScrHeader head;
FD_ZERO(&saveset);
FD_SET(sock, &saveset);
while(!err)
{
readset = saveset;
count = select(FD_SETSIZE, &readset, NULL, NULL, NULL);
if(count > 0)
{
int ret;
int mode;
int size;
if(FD_ISSET(sock, &readset))
{
ret = read(sock, &head, sizeof(head));
if((ret != sizeof(head)) || (LE32(head.magic) != JOY_MAGIC))
{
fprintf(stderr, "Error in socket %d, magic %08X\n", ret, head.magic);
flush_socket(sock);
break;
}
mode = LE32(head.mode);
size = LE32(head.size);
g_buffers[frame].head.mode = mode;
g_buffers[frame].head.size = size;
if(mode < 0)
{
if(g_context.args.video)
{
post_event(EVENT_ENABLE_SCREEN);
}
else
{
g_context.scron = 0;
}
}
else if(mode > 3)
{
/* Flush socket */
flush_socket(sock);
}
else
{
/* Try and read in screen */
/* If we do not get a full frame read and we timeout in quater second or so then
* reset sync as it probably means the rest isn't coming */
int loc = 0;
//fprintf(stderr, "Size %d\n", size);
while(1)
{
readset = saveset;
/* Should have a time out */
count = select(FD_SETSIZE, &readset, NULL, NULL, NULL);
if(count > 0)
{
ret = read(sock, &(g_buffers[frame].buf[loc]), size-loc);
if(ret < 0)
{
if(errno != EINTR)
{
perror("read:");
err = 1;
break;
}
}
else if(ret == 0)
{
fprintf(stderr, "EOF\n");
break;
}
//fprintf(stderr, "Read %d\n", loc);
loc += ret;
if(loc == size)
{
break;
}
}
else if(count < 0)
{
if(errno != EINTR)
{
perror("select:");
err = 1;
break;
}
}
}
if(!err)
{
if(frame)
{
post_event(EVENT_RENDER_FRAME_2);
}
else
{
post_event(EVENT_RENDER_FRAME_1);
}
frame ^= 1;
}
}
}
}
else if(count < 0)
{
if(errno != EINTR)
{
perror("select:");
err = 1;
}
}
}
return 0;
}
SDL_Surface *create_surface(void *buf, int mode)
{
unsigned int rmask, bmask, gmask, amask;
int currw, currh;
int bpp;
currw = PSP_SCREEN_W;
currh = PSP_SCREEN_H;
if(g_context.args.halfsize)
{
currw >>= 1;
currh >>= 1;
}
if(VERBOSE)
{
printf("Mode %d\n", mode);
}
switch(mode)
{
case 3:
rmask = LE32(0x000000FF);
gmask = LE32(0x0000FF00);
bmask = LE32(0x00FF0000);
amask = 0;
bpp = 32;
break;
case 2:
rmask = LE16(0x000F);
gmask = LE16(0x00F0);
bmask = LE16(0x0F00);
amask = 0;
bpp = 16;
break;
case 1:
rmask = LE16(0x1F);
gmask = LE16(0x1F << 5);
bmask = LE16(0x1F << 10);
amask = 0;
bpp = 16;
break;
case 0:
rmask = LE16(0x1F);
gmask = LE16(0x3F << 5);
bmask = LE16(0x1F << 11);
amask = 0;
bpp = 16;
break;
default: return NULL;
};
return SDL_CreateRGBSurfaceFrom(buf, currw, currh, bpp, currw*(bpp/8),
rmask, gmask, bmask, amask);
}
void save_screenshot(SDL_Surface *surface)
{
int i;
char path[PATH_MAX];
struct stat s;
/* If we cant find one in the next 1000 then dont bother */
for(i = 0; i < 1000; i++)
{
snprintf(path, PATH_MAX, "scrshot%03d.bmp", i);
if(stat(path, &s) < 0)
{
break;
}
}
if(i == 1000)
{
return;
}
if(SDL_SaveBMP(surface, path) == 0)
{
printf("Saved screenshot to %s\n", path);
}
else
{
printf("Error saving screenshot\n");
}
}
void mainloop(void)
{
SDL_Joystick *stick = NULL;
SDL_Surface *screen = NULL;
SDL_Surface *buf1 = NULL;
SDL_Surface *buf2 = NULL;
SDL_Thread *thread = NULL;
int currw, currh;
int sdl_init = 0;
int sock = -1;
unsigned int button_state = 0;
int currmode[2] = { 3, 3 };
int flags = SDL_HWSURFACE;
int pspflags = 0;
int showfps = 0;
I hit the post size limit, so I deleted everything not essential to the controller mapping and left some of the display transmission code in.
But no one will probably even read this... I LOVE XDADEVELOPERS!

[TOOL][APP][WIP] Native AT Command Injector

I am trying to develop an application that can send AT commands directly to the modem, from a native application interface (or possibly via a terminal).
There are many obstacles here, but the main one is that the modem AT interface is rarely available through a simple serial device, which is why all other AT command apps always fail on Samsung Galaxy class devices, which use a modified IPC protocol over sockets. However, by using the built-in RIL_OEM_HOOK_RAW and RIL_OEM_HOOK_STR requests we should be able to do this.
I then found this document for the Ublox LISA-U2 UMTS/HSPA voice and data modules, which shows an example Android app, including most of the code used in that app. However, I cannot get this to compile in Eclipse. I think the reason is a combination of cosmetic changes in AOSP and a number of typos in the code in that document.
I would very much like to get this or something very similar to work. I'm mainly developing on the Samsung Galaxy S2 (GB).
Here is a picture (from that document).
Any help with this would be very very much appreciated!
It will also benefit the rest of the community as it will provide a foundation
for my future development of many other things to come, for free!
BTW. The BP used in that device is the very popular Intel/Infineon XGOLD-626...
When trying to run the code shown below in Eclipse (using GB 2.3.3+),
it fails with the following errors:
The import com.android.internal.telephony cannot be resolved
The import android.os.AsyncResult cannot be resolved
It seems like it doesn't recognize these imports:
Code:
import com.android.internal.telephony.Phone;
import com.android.internal.telephony.PhoneFactory;
import android.os.AsyncResult;
The java code is:
Code:
/*=========================================================
Demo App Code by Ublox, modified copy and paste from:
http://www.u-blox.com/images/downloads/Product_Docs/AndroidRIL_Source_Code_ApplicationNote_%283G.G2-CS-11003%29.pdf
=========================================================== */
package com.testapp.sat;
import android.app.Activity;
import android.os.Bundle;
import android.view.View;
import android.widget.EditText;
import android.widget.RadioButton;
import android.widget.RadioGroup;
import android.widget.Toast;
import com.android.internal.telephony.Phone;
import com.android.internal.telephony.PhoneFactory;
import android.view.View.OnKeyListener;
import android.view.KeyEvent;
import android.os.Message;
import android.os.Handler;
import android.os.AsyncResult;
import android.util.Log;
import android.app.AlertDialog;
public class RilOemHookTest extends Activity
{
private static final String LOG_TAG = "RILOemHookTestApp";
private RadioButton mRadioButtonAPI1 = null;
private RadioGroup mRadioGroupAPI = null;
private Phone mPhone = null;
private EditText CmdRespText = null;
private static final int EVENT_RIL_OEM_HOOK_CMDRAW_COMPLETE = 1300;
private static final int EVENT_RIL_OEM_HOOK_CMDSTR_COMPLETE = 1400;
private static final int EVENT_UNSOL_RIL_OEM_HOOK_RAW = 500;
private static final int EVENT_UNSOL_RIL_OEM_HOOK_STR = 600;
@Override
public void onCreate(Bundle icicle) {
super.onCreate(icicle);
setContentView(R.layout.riloemhook_layout);
mRadioButtonAPI1 = (RadioButton) findViewById(R.id.radio_api1);
mRadioGroupAPI = (RadioGroup) findViewById(R.id.radio_group_api);
// Initially turn on first button.
mRadioButtonAPI1.toggle();
//Get our main phone object.
// mPhone = PhoneFactory.getDefaultPhone();
//Register for OEM raw notification.
// mPhone.mCM.setOnUnsolOemHookRaw(mHandler, EVENT_UNSOL_RIL_OEM_HOOK_RAW, null);
//Capture text edit key press
CmdRespText = (EditText) findViewById(R.id.edit_cmdstr);
CmdRespText.setOnKeyListener(new OnKeyListener() {
public boolean onKey(View v, int keyCode, KeyEvent event) {
//If the event is a key-down event on the "enter" button
if ((event.getAction() == KeyEvent.ACTION_DOWN) && (keyCode == KeyEvent.KEYCODE_ENTER)) {
//Perform action on key press
Toast.makeText(RilOemHookTest.this,
CmdRespText.getText(), Toast.LENGTH_SHORT).show();
return true;
}
return false;
}
});
}
@Override
public void onPause()
{
super.onPause();
log("onPause()");
//Unregister for OEM raw notification.
// mPhone.mCM.unSetOnUnsolOemHookRaw(mHandler);
}
@Override
public void onResume()
{
super.onResume();
log("onResume()");
//Register for OEM raw notification.
// mPhone.mCM.setOnUnsolOemHookRaw(mHandler, EVENT_UNSOL_RIL_OEM_HOOK_RAW, null);
}
public void onRun(View view)
{
//Get the checked button
int idButtonChecked = mRadioGroupAPI.getCheckedRadioButtonId();
//Get the response field
CmdRespText = (EditText) findViewById(R.id.edit_response);
byte[] oemhook = null;
switch(idButtonChecked)
{
case R.id.radio_api1:
oemhook = new byte[1];
oemhook[0] = (byte)0xAA;
break;
case R.id.radio_api2:
oemhook = new byte[2];
oemhook[0] = (byte)0xBB;
oemhook[1] = (byte)0x55;
break;
case R.id.radio_api3:
//Send OEM notification (just echo the data bytes)
oemhook = new byte[7];
oemhook[0] = (byte)0xCC;
oemhook[1] = (byte)0x12;
oemhook[2] = (byte)0x34;
oemhook[3] = (byte)0x56;
oemhook[4] = (byte)0x78;
oemhook[5] = (byte)0x9A;
oemhook[6] = (byte)0xBC;
break;
case R.id.radio_api4:
//Send OEM command string
break;
default:
log("unknown button selected");
break;
}
if (idButtonChecked!=R.id.radio_api4) {
Message msg =
mHandler.obtainMessage(EVENT_RIL_OEM_HOOK_CMDRAW_COMPLETE);
mPhone.invokeOemRilRequestRaw(oemhook, msg);
CmdRespText.setText("");
} else {
//Copy string from EditText and add carriage return
String[] oemhookstring = { ((EditText)
findViewById(R.id.edit_cmdstr)).getText().toString()+'\r' } ;
//Create message
Message msg =
mHandler.obtainMessage(EVENT_RIL_OEM_HOOK_CMDSTR_COMPLETE);
//Send request
mPhone.invokeOemRilRequestStrings(oemhookstring, msg);
CmdRespText = (EditText) findViewById(R.id.edit_response);
CmdRespText.setText("---Wait response---");
}
}
private void logRilOemHookResponse(AsyncResult ar) {
log("received oem hook response");
String str = new String("");
if (ar.exception != null) {
log("Exception:" + ar.exception);
str += "Exception:" + ar.exception + "\n\n";
}
if (ar.result != null)
{
byte[] oemResponse = (byte[])ar.result;
int size = oemResponse.length;
log("oemResponse length=[" + Integer.toString(size) + "]");
str += "oemResponse length=[" + Integer.toString(size) + "]" + "\n";
if (size > 0) {
for (int i=0; i<size; i++) {
byte myByte = oemResponse[i];
int myInt = (int)(myByte & 0xFF);
log("oemResponse[" + Integer.toString(i) + "]=[0x" +
Integer.toString(myInt,16) + "]");
str += "oemResponse[" + Integer.toString(i) + "]=[0x" +
Integer.toString(myInt,16) + "]" + "\n";
}
}
} else {
log("received NULL oem hook response");
str += "received NULL oem hook response";
}
// Display message box
AlertDialog.Builder builder = new AlertDialog.Builder(this);
builder.setMessage(str);
builder.setPositiveButton("OK", null);
AlertDialog alert = builder.create();
alert.show();
}
private void logRilOemHookResponseString(AsyncResult ar) {
log("received oem hook string response");
String str = new String("");
CmdRespText = (EditText) findViewById(R.id.edit_response);
if (ar.exception != null) {
log("Exception:" + ar.exception);
str += "Exception:" + ar.exception + "\n\n";
}
if (ar.result != null) {
String[] oemStrResponse = (String[])ar.result;
int sizeStr = oemStrResponse.length;
log("oemResponseString[0] [" + oemStrResponse[0] + "]");
CmdRespText.setText( "" + oemStrResponse[0] );
} else {
log("received NULL oem hook response");
CmdRespText.setText( "No response or error received" );
}
}
private void log(String msg) {
Log.d(LOG_TAG, "[RIL_HOOK_OEM_TESTAPP] " + msg);
}
private Handler mHandler = new Handler() {
public void handleMessage(Message msg) {
AsyncResult ar;
switch (msg.what) {
case EVENT_RIL_OEM_HOOK_CMDRAW_COMPLETE:
log("EVENT_RIL_OEM_HOOK_CMDRAW_COMPLETE");
ar = (AsyncResult) msg.obj;
logRilOemHookResponse(ar);
break;
case EVENT_RIL_OEM_HOOK_CMDSTR_COMPLETE:
log("EVENT_RIL_OEM_HOOK_CMDSTR_COMPLETE");
ar = (AsyncResult) msg.obj;
logRilOemHookResponseString(ar);
break;
case EVENT_UNSOL_RIL_OEM_HOOK_RAW:
break;
case EVENT_UNSOL_RIL_OEM_HOOK_STR:
break;
}
}
};
}
------------------ EDIT ---------------------
Of course they aren't recognized! They are internal packages, not available outside the standard AOSP APIs. But I'm very stubborn and have now spent 2 days trying to use those damn packages anyway! I managed! I will post an update here and a whole new thread on how that is done, once I can get some other bugs out of the way...
new thread posted?
Here is my new thread on how to use and import internal packages into your project,
including instructions for hacking the Eclipse ADT plugin, in order to allow for using
the com.android.internal classes.
"[APP][DEV][GUIDE] Using the Android Java Internal/Hidden API classes"
It is a work in progress (WIP), and includes a lot of manual file manipulations.
As such it is closely related to this thread, so keep an eye open for updates.
EDIT: 2014-01-06
That thread still works for API-17 JARs!!
So, I managed to get it to compile but not without two main issues.
0) I got it to run on an SGS2 running GB2.3.4, but it FC's after being run.
1) I kept on getting AlertDialog builder complaints like:
"the constructor AlertDialog.Builder is undefined" so I had to comment out all code related to that. I searched for fixes, but since I'm not a Java programmer, I could not resolve this in the proper way...
2) Then I ran into some other undocumented errors from my own android.jar (built according to the instructions), so I just used inazaruk's pre-made one, and it compiled. Here are his files:
https://github.com/inazaruk/android-sdk/tree/master/platforms
The main point is that we should not expect to blindly use the Ublox code and think it will work. Instead we need to understand the following:
a) How to properly use the RIL constants
Code:
RILConstants.java:
int RIL_REQUEST_OEM_HOOK_RAW = 59;
int RIL_REQUEST_OEM_HOOK_STRINGS = 60;
int RIL_UNSOL_OEM_HOOK_RAW = 1028;
my code:
EVENT_RIL_OEM_HOOK_CMDRAW_COMPLETE
EVENT_RIL_OEM_HOOK_CMDSTR_COMPLETE
EVENT_UNSOL_RIL_OEM_HOOK_RAW
EVENT_UNSOL_RIL_OEM_HOOK_STR
b) What the various byte codes being sent are actually doing, and how to make them do what we want.
Code:
case R.id.radio_api1:
oemhook = new byte[1];
oemhook[0] = (byte)0xAA;
break;
case R.id.radio_api2:
oemhook = new byte[2];
oemhook[0] = (byte)0xBB;
oemhook[1] = (byte)0x55;
break;
case R.id.radio_api3:
//Send OEM notification (just echo the data bytes)
oemhook = new byte[7];
oemhook[0] = (byte)0xCC;
oemhook[1] = (byte)0x12;
oemhook[2] = (byte)0x34;
oemhook[3] = (byte)0x56;
oemhook[4] = (byte)0x78;
oemhook[5] = (byte)0x9A;
oemhook[6] = (byte)0xBC;
break;
c) If we can simplify the code to just send ONE hard-coded AT command and read the response (see the sketch right after this list).
d) Alternatively, use a completely different method that will undoubtedly work but will be device dependent: the (Samsung-modified) IPC modem-communication protocols, as discussed in my older threads...
e) Find out how to use a local shell + device to send AT commands directly to the BP.
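Regarding c), here is a minimal sketch of what sending a single hard-coded AT command could look like, reusing the same internal Phone.invokeOemRilRequestStrings() call and Handler pattern as the demo app above. The command string (AT+CGMR), the event constant and the method name sendSingleAtCommand are only illustrative, and whether the vendor RIL actually tunnels the string through to the BP as an AT command is entirely vendor/device dependent:
Code:
// Sketch only: one hard-coded AT command via the OEM hook strings request.
// mPhone and LOG_TAG are the same fields used in the demo code above.
private static final int EVENT_AT_CMD_DONE = 100; // illustrative value

private void sendSingleAtCommand() {
    // One hard-coded command, terminated with a carriage return
    String[] request = { "AT+CGMR" + '\r' };
    Message msg = mHandler.obtainMessage(EVENT_AT_CMD_DONE);
    mPhone.invokeOemRilRequestStrings(request, msg);
}

private final Handler mHandler = new Handler() {
    public void handleMessage(Message msg) {
        if (msg.what != EVENT_AT_CMD_DONE) return;
        AsyncResult ar = (AsyncResult) msg.obj;
        if (ar.exception != null || ar.result == null) {
            Log.d(LOG_TAG, "No usable response: " + ar.exception);
            return;
        }
        // RIL_REQUEST_OEM_HOOK_STRINGS hands back a String[] (see the ril.h excerpt further down)
        for (String line : (String[]) ar.result) {
            Log.d(LOG_TAG, "AT response: " + line);
        }
    }
};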
To monitor the radio and related messages from the app (on a Samsung), you can use:
Code:
adb shell logcat AndroidRuntime:* ActivityManager:* dalvikvm:* DataRouter:E NetdConnector:D *:s
I've done similar work in my SprintDiagnostics app included in our Eos project. It's in the Sprint GNex thread. There's no way to make this a user app, as you have to be on the phone looper thread, meaning you have to declare the phone process in the manifest for the activity calling the raw RIL requests. I can grab the MSL, write the PRL, and some other good stuff, and enable diagnostic mode based on shell work from Autoprime.
Edit: http://git.teameos.org/eos/device/samsung/toroplus.git/tree/SprintDiagnostics?h=jellybean
bigrushdog said:
I've done similar work in my SprintDiagnostics app included in our Eos project. It's in the Sprint GNex thread. There's no way to make this a user app, as you have to be on the phone looper thread, meaning you have to declare the phone process in the manifest for the activity calling the raw RIL requests. I can grab the MSL, write the PRL, and some other good stuff, and enable diagnostic mode based on shell work from Autoprime.
Edit: http://git.teameos.org/eos/device/samsung/toroplus.git/tree/SprintDiagnostics?h=jellybean
Hi! Thanks for the interesting info, but unfortunately I don't understand everything you're saying, and I have many more questions than answers.
First of all, what thread are you referring to? (Could you give a link?) Second, how do you do that declaration? (Code example?) BTW, I looked briefly at the code and must admit it seems a bit cryptic, as I didn't quite find where the ATs go or are created. Third, what is "msl"? Do you have an APK to try on? Finally, what specific modem (cellular) processor are you running in that device? LTE?
Screenshots?
Thanks in advance.
Hi There
Forgive me if I'm oversimplifying it here but wouldn't minicom or some similar terminal program do this? I believe minicom is included in busybox.
Surely you could just wrap that around using JNI
AT commands in Android are sent natively by the vendor-implemented RIL library, normally set in the global property rild.libpath. It may be worth stepping away from the Java code for a moment and looking at the rild and reference-ril sources in the AOSP tree. atchannel.c is where the "magic" happens. Also, if you haven't already, read the PDK Telephony documentation.
When you get down to it, all you really want to do is read and write from a tty character device so doing it "raw" is always an option.
Also to monitor the Radio log in android use adb logcat -b radio , You will be enlightened as you watch the AT commands fly by!
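For illustration only, a minimal sketch of that "raw" approach from Java, assuming the device actually exposes a readable/writable modem node; the path /dev/smd0 is a made-up example, it varies per device, normally needs root or special permissions, and (as the next reply points out) on many devices no such node exists at all:
Code:
// Sketch only: raw read/write against a hypothetical modem character device.
import java.io.IOException;
import java.io.RandomAccessFile;

public class RawAtChannel {
    public static void main(String[] args) throws IOException {
        // Example path only; the node name (if one exists) is device dependent
        RandomAccessFile tty = new RandomAccessFile("/dev/smd0", "rw");
        try {
            tty.write("AT\r".getBytes());   // send a trivial AT command
            byte[] buf = new byte[256];
            int n = tty.read(buf);          // blocking read, no timeout handling here
            if (n > 0) {
                System.out.println(new String(buf, 0, n));
            }
        } finally {
            tty.close();
        }
    }
}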
trevd said:
Forgive me if I'm oversimplifying it here but ...
Also to monitor the Radio log in android use adb logcat -b radio , You will be enlightened as you watch the AT commands fly by!
No, you are missing the entire point of this thread. We are trying to circumvent actually using the vendor RIL, by tunneling the commands via the OEM_RAW requests. That is because it often filters out non-standard OEM AT commands that are not part of the common standard. These are all dependent on the BP/CP and vary widely from device to device. In addition, on many devices direct modem serial access is either not possible (because there is no such connection/transport) or it has been blocked/disabled. The Java way of circumventing this is what this thread is all about. Also, the "-b radio" logcat doesn't fetch ATs sent by vendor-specific OEM IPCs, for example. This is the case for the SGS2 (GT-I9100) and many other devices.
E:V:A said:
No, you are missing the entire point of this thread.
Ah, okay... If I had an SGS2 to play with I'd probably join in on this little adventure, but I don't
trevd said:
Ah, okay... If I had an SGS2 to play with I'd probably join in on this little adventure, but I don't
Well, again... The point is that RIL_OEM_HOOK_RAW and RIL_OEM_HOOK_STR are (internally) accessible for all RILs, which means that you CAN join in!
E:V:A said:
Hi! Thanks for the interesting info, but unfortunately I don't understand everything you're saying, and I have many more questions than answers.
First of all, what thread are you referring to? (Could you give a link?) Second, how do you do that declaration? (Code example?) BTW, I looked briefly at the code and must admit it seems a bit cryptic, as I didn't quite find where the ATs go or are created. Third, what is "msl"? Do you have an APK to try on? Finally, what specific modem (cellular) processor are you running in that device? LTE?
Screenshots?
Thanks in advance.
My apologies for not responding sooner. I forgot to subscribe ;( . I'd like to try and clear up my previous post as well as try to shed some light on this topic. First, I'm by no means a telephony expert. I've read some of your other threads and found them remarkable. I write Java and do feature development in an AOSP project. I am currently doing telephony work on Samsung Sprint CDMA devices including the Galaxy Nexus and Nexus S, and am working on the S3. As you know, those use the VIA chipset. However, working through the Android radio layer makes that irrelevant, for the most part. I've also worked with Autoprime on some telephony/modem/RIL matters.
First, some code and background on what I have done on the Sprint Galaxy Nexus. These are some code fragments from my CDMATools app.
Manifest:
Code:
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="org.teameos.settings.device"
android:sharedUserId="android.uid.system"
android:versionCode="1"
android:versionName="1.0" >
Of importance is the android.uid.system shared user ID. This allows the app to access the phone process, which is required for PhoneFactory.getDefaultPhone(). The following activity queries the RIL for the Master Subsidy Lock code (MSL):
Code:
<activity
android:name=".MslActivity"
android:label="@string/msl_activity"
android:process="com.android.phone" >
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.DEFAULT" />
</intent-filter>
</activity>
And the actual method which is used to query the radio...
Code:
private void checkMSLCode()
{
ByteArrayOutputStream bytearrayoutputstream;
DataOutputStream dataoutputstream;
bytearrayoutputstream = new ByteArrayOutputStream();
dataoutputstream = new DataOutputStream(bytearrayoutputstream);
try {
dataoutputstream.writeByte(main_cmd_hidden);
dataoutputstream.writeByte(OEM_SUB_CMD_GET_MSL);
dataoutputstream.writeShort(device_short);
if (isTuna) {
dataoutputstream.writeByte(OEM_SUB_RAW_MSL);
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
mPhone.invokeOemRilRequestRaw(bytearrayoutputstream.toByteArray(),
mHandler.obtainMessage(GET_MSL_DONE));
}
A snippet of the handler code to receive and process incoming messages from the radio:
Code:
mHandler = new Handler() {
public void handleMessage(Message message) {
Log.i(LOG_TAG, "MSL response incoming!!");
AsyncResult asyncresult = (AsyncResult) message.obj;
if (asyncresult.exception != null)
{
Log.i(LOG_TAG, "AsyncResult Exception Occur!!!");
} else
if (asyncresult.result == null)
{
Log.i(LOG_TAG, "ar.result == NULL! - No answer for MSL response");
} else
{
byte abyte0[] = (byte[]) (byte[]) asyncresult.result;
String s = new String("");
for (int j = 0; j < 6; j++)
s = (new StringBuilder()).append(s).append((char) abyte0[j]).toString();
Intent intent = new Intent().setAction(getString(R.string.msl_action));
intent.putExtra(getString(R.string.msl_key), Integer.parseInt(s));
mContext.sendBroadcast(intent);
finish();
}
}
};
I have been able to get the raw commands from RE'ing various device-proprietary APKs. Of importance to this topic is this checkMSLCode() function. This is the equivalent of raw AT commands, but being passed through the Android radio layer. From looking at the code for that AT command application you posted in the OP, it uses a similar structure. In fact, if time permits, I'm confident I can expand on it and make it somewhat more robust. One problem I see is finding an effective way to interpret the AT command responses from the radio. Requests often get a response in different forms, thus requiring a unique handler for each function. Of course, we could always do an Object.toString() to the log and try to decipher it. I hope this info sheds some light on the topic. I'll post when I come up with more.
Edit: Also, I should note that I build this APK inside of my AOSP repo, so it sees all the internal APIs automatically. I'm now looking at using reflection to do some telephony stuff, but it's proving rather tricky.
Very nice! (That was an understatement!)
I just wish I knew how to apply that...
Regarding the raw commands (and their appropriate responses), I think there are both code and documents available for this, for Qualcomm chips. I know nothing about the VIA chipsets, but a whole lot more about the Qualcomm MSM stuff... Many CDMA devices use just Qualcomm, so perhaps you've come across some of these. I bet we could get this working on them, somehow.
Just as you said, the RIL should take care of most of the details; we just need to apply the command requests and responses. But my Java app-making skills are really not even worth a try at the moment.
Anybody else who'd care to join us and give this a try?
As an aside and a note of reference, mostly to myself (for a professional OEM programmer this may be obvious, but to me it's not): after having dug through some of the QC MSM code, I've come to the conclusion that QC likes to use what they call SMD (Shared Memory Device) for doing their IPC stuff, which includes talking to the modem. For the Infineon/Intel modems, we know that Samsung has developed its own (non-standard) IPC protocol for this communication...
While rewriting my CDMATools app to make it fragment based, I used reflection to access all the hidden APIs. I'll fork the source for the AT command app and apply the reflected classes.
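For anyone wanting to try the same route, a rough sketch of what that reflection approach can look like, using the class and method names already shown in this thread (PhoneFactory.getDefaultPhone() and invokeOemRilRequestRaw()). This is only a sketch: the app still has to run in the phone process with a platform/system signature as discussed above, and the hidden API can differ between Android versions:
Code:
// Sketch only: reflective access to the hidden telephony calls used above.
import java.lang.reflect.Method;
import android.os.Message;

public final class ReflectedPhone {
    private final Object mPhone;      // com.android.internal.telephony.Phone
    private final Method mInvokeRaw;

    public ReflectedPhone() throws Exception {
        Class<?> factory = Class.forName("com.android.internal.telephony.PhoneFactory");
        // Static call: PhoneFactory.getDefaultPhone()
        mPhone = factory.getMethod("getDefaultPhone").invoke(null);
        mInvokeRaw = mPhone.getClass().getMethod(
                "invokeOemRilRequestRaw", byte[].class, Message.class);
    }

    // Equivalent of mPhone.invokeOemRilRequestRaw(data, response)
    public void invokeOemRilRequestRaw(byte[] data, Message response) throws Exception {
        mInvokeRaw.invoke(mPhone, data, response);
    }
}
Usage would then mirror the checkMSLCode() call shown earlier, just going through this wrapper instead of a direct import of the internal classes.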
I also got access to my Samsung Galaxy S3 via reflection... although I had to go AOSP and sign with the same system key to run on the phone looper. But now I just need help with what to send........ via RAW/String
bigrushdog said:
While rewriting my CDMATools app to make it fragment based, I used reflection to access all the hidden APIs. I'll fork the source for the AT command app and apply the reflected classes.
Excellent. Where can we download this?
Have you been able to make a demo app or something?
enigma99a said:
I also got access to my Samsung Galaxy S3 via reflection... although I had to go AOSP and sign with the same system key to run on the phone looper. But now I just need help with what to send........ via RAW/String
Can you elaborate? Do you need info on what AT commands to send, or are you asking about the protocol?
PS. It would be great if you could say something more about how you did it, so that perhaps other interested people can join in. We need more people on this, so that we can start working on THIS project.
My apologies that I've been unable to continue efforts on that project. My AOSP project consumes all of my development time and then some. Here is the source for CDMATools. I was looking for ways to monetize it; however, due to Android's permission scheme, it's not gonna happen lol. So here ya go.
www.miuimods.com/bigrushdog/CDMATools.tar.bz2
Compile it in Eclipse, then sign with the AOSP platform test keys. I use some libraries that require it to be compiled in Eclipse, and because of the internal telephony imports, it must have the platform signature. Also, because it's compiled in Eclipse, I use reflection to access internal telephony. Of interest will be the Phone service class; everything else is just UI. Look at how I structure the reflection. You can reflect all the internal telephony calls in the same fashion. Any issues, just hit me, but my time now is very limited as I have a large team to manage and my plate is more than full. Good luck!
feel free to post on github, distribute, or whatever.
So I have not yet had time to modify the OP app or to use bigrushdog's CDMATools, but I will soon. In the meantime I'll just post the internals for the _OEM_HOOK_ requests.
The code base is here:
https://android.googlesource.com/platform/frameworks/base.git/
So from the JellyBean RIL.java:
Code:
public void invokeOemRilRequestRaw(byte[] data, Message response) {
RILRequest rr = RILRequest.obtain(RIL_REQUEST_OEM_HOOK_RAW, response);
if (RILJ_LOGD) riljLog(rr.serialString() + "> " + requestToString(rr.mRequest)
+ "[" + IccUtils.bytesToHexString(data)
+ "]");
rr.mp.writeByteArray(data);
send(rr);
}
public void invokeOemRilRequestStrings(String[] strings, Message response) {
RILRequest rr = RILRequest.obtain(RIL_REQUEST_OEM_HOOK_STRINGS, response);
if (RILJ_LOGD) riljLog(rr.serialString() + "> " + requestToString(rr.mRequest));
rr.mp.writeStringArray(strings);
send(rr);
}
And the expected response can be found in the ril.h:
Code:
#define RIL_REQUEST_DATA_REGISTRATION_STATE 21
/**
* RIL_REQUEST_DATA_REGISTRATION_STATE
*
* Request current DATA registration state
*
* "data" is NULL
* "response" is a "char **"
* ((const char **)response)[0] is registration state 0-5 from TS 27.007 10.1.20 AT+CGREG
* ((const char **)response)[1] is LAC if registered or NULL if not
* ((const char **)response)[2] is CID if registered or NULL if not
* ((const char **)response)[3] indicates the available data radio technology,
* valid values as defined by RIL_RadioTechnology.
* ((const char **)response)[4] if registration state is 3 (Registration
* denied) this is an enumerated reason why
* registration was denied. See 3GPP TS 24.008,
* Annex G.6 "Additonal cause codes for GMM".
* 7 == GPRS services not allowed
* 8 == GPRS services and non-GPRS services not allowed
* 9 == MS identity cannot be derived by the network
* 10 == Implicitly detached
* 14 == GPRS services not allowed in this PLMN
* 16 == MSC temporarily not reachable
* 40 == No PDP context activated
* ((const char **)response)[5] The maximum number of simultaneous Data Calls that can be
* established using RIL_REQUEST_SETUP_DATA_CALL.
*
* The values at offsets 6..10 are optional LTE location information in decimal.
* If a value is unknown that value may be NULL. If all values are NULL,
* none need to be present.
* ((const char **)response)[6] is TAC, a 16-bit Tracking Area Code.
* ((const char **)response)[7] is CID, a 0-503 Physical Cell Identifier.
* ((const char **)response)[8] is ECI, a 28-bit E-UTRAN Cell Identifier.
* ((const char **)response)[9] is CSGID, a 27-bit Closed Subscriber Group Identity.
* ((const char **)response)[10] is TADV, a 6-bit timing advance value.
*
* LAC and CID are in hexadecimal format.
* valid LAC are 0x0000 - 0xffff
* valid CID are 0x00000000 - 0x0fffffff
*
* Please note that registration state 4 ("unknown") is treated
* as "out of service" in the Android telephony system
*
* Valid errors:
* SUCCESS
* RADIO_NOT_AVAILABLE
* GENERIC_FAILURE
*/
#define RIL_REQUEST_OEM_HOOK_RAW 59
/**
* RIL_REQUEST_OEM_HOOK_RAW
*
* This request reserved for OEM-specific uses. It passes raw byte arrays
* back and forth.
*
* It can be invoked on the Java side from
* com.android.internal.telephony.Phone.invokeOemRilRequestRaw()
*
* "data" is a char * of bytes copied from the byte[] data argument in java
* "response" is a char * of bytes that will returned via the
* caller's "response" Message here:
* (byte[])(((AsyncResult)response.obj).result)
*
* An error response here will result in
* (((AsyncResult)response.obj).result) == null and
* (((AsyncResult)response.obj).exception) being an instance of
* com.android.internal.telephony.gsm.CommandException
*
* Valid errors:
* All
*/
#define RIL_REQUEST_OEM_HOOK_STRINGS 60
/**
* RIL_REQUEST_OEM_HOOK_STRINGS
*
* This request reserved for OEM-specific uses. It passes strings
* back and forth.
*
* It can be invoked on the Java side from
* com.android.internal.telephony.Phone.invokeOemRilRequestStrings()
*
* "data" is a const char **, representing an array of null-terminated UTF-8
* strings copied from the "String[] strings" argument to
* invokeOemRilRequestStrings()
*
* "response" is a const char **, representing an array of null-terminated UTF-8
* stings that will be returned via the caller's response message here:
*
* (String[])(((AsyncResult)response.obj).result)
*
* An error response here will result in
* (((AsyncResult)response.obj).result) == null and
* (((AsyncResult)response.obj).exception) being an instance of
* com.android.internal.telephony.gsm.CommandException
*
* Valid errors:
* All
*/
#define RIL_REQUEST_RADIO_POWER 23
/**
* RIL_REQUEST_RADIO_POWER
*
* Toggle radio on and off (for "airplane" mode)
* If the radio is is turned off/on the radio modem subsystem
* is expected return to an initialized state. For instance,
* any voice and data calls will be terminated and all associated
* lists emptied.
*
* "data" is int *
* ((int *)data)[0] is > 0 for "Radio On"
* ((int *)data)[0] is == 0 for "Radio Off"
*
* "response" is NULL
*
* Turn radio on if "on" > 0
* Turn radio off if "on" == 0
*
* Valid errors:
* SUCCESS
* RADIO_NOT_AVAILABLE
* GENERIC_FAILURE
*/
The last one seems to imply that there could be other "bytes" to be read from that data[] array... Some suggestions have been the transmitted power... But this was assuming a Qualcomm-based modem, so there is no telling whether there is another standard there. We don't know...
Here are links to two highly relevant threads on Gmane/GoogleGroups with code excerpts similar to what we need...
http://article.gmane.org/gmane.comp.handhelds.android.ndk/10555
http://article.gmane.org/gmane.comp.handhelds.android.platform/8436
https://groups.google.com/forum/?fromgroups=#!topic/android-platform/tVyNMnXtcEI
With a link to Google Phone G1 Field Test:
http://phoneftd.blogspot.com/2009/03/google-g1-phone-field-test.html

RasPiPhrase (Catchprase-like Code/RPiDevice)

This program is written in C and imports a few Python scripts for LCD-related actions (a few of the scripts are provided by Adafruit). Basically, the program extends the features of the classic Catchphrase electronic game by allowing a variable number of categories and words for each. An overall synopsis of the methods used to drive this program and device: POSIX threads, signals, PWM, GPIO, multi-dimensional array allocation, Unix directory functions, and more. The directory of word files is opened (each file formatted as "CATEGORY_.txt"), and the names of each text file (minus the underscore and extension) are loaded in as categories. Next, a three-dimensional array is allocated to hold each category and its corresponding words, loaded in as strings.
The code is thoroughly commented and is hopefully understandable. The hardware used was an Adafruit LCD (I2C), four momentary push-button switches, a Raspberry Pi, a 3V (300-500Hz rated) buzzer, and an enclosure (no rechargeable battery was added because of the extra money involved, but one could easily be used as well). The rules are very similar to those of the normal game. During the timer sequence (~55s) when team-mates are guessing a word, pulse-width modulation is used to simulate a 400Hz square wave with a 15s period and 50% duty cycle (the calculation used to achieve this is explained in the code). This value is changed over the timer interval to speed up the pulsing and indicate that the sequence is nearing its end. Honestly, it probably would have been better to separate some of the functions into other modules, but I decided against it for this release for simplicity's sake.
I hope this may be useful for someone interested in creating their own RasPiPhrase device. If you notice any bugs or issues, please let me know and I will correct them. The necessary external libraries are wiringPi and python2.7-dev; the rest is part of standard Unix/C. A makefile that I wrote is included (in the attached zip), as well as all necessary code-related files. Some example word files are included too, but you can drop in as many as you want and modify them however you want; the code takes care of the rest. This is the first version and has been tested, though not extensively. Let me know if you have any questions about the code. Enjoy!
-Chris
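For anyone following the PWM math in timer_buzz() below: clk_div() divides the Pi's 19.2MHz PWM base clock by the requested frequency, so clk_div(400) = 19.2e6 / 400 = 48000, and range() computes period / (1 / freq), so range(400, 15) = 15 * 400 = 6000; pwmWrite() is then handed a fraction of that range (0.25, 0.50, 0.75, ...) to set the duty cycle of the buzzer pulses over the course of the round.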
Code:
/*
============================================================================
Name : RasPiPhrase.c
Author : Christopher B. Harvey
Version : 1.0
Description : Top-level part of program that completes initial setup,
* waits during game, and then cleans up resources at the end.
============================================================================
*/
#include "/usr/include/python2.7/Python.h"
#include <unistd.h>
#include <errno.h>
#include <stdio.h>
#include <wiringPi.h>
#include <malloc.h>
#include <time.h>
#include <string.h>
#include <sys/resource.h>
#include <math.h>
#include <signal.h>
#include <pthread.h>
#include <dirent.h>
#include "lcd.h"
//gpio pin assignments
#define NUM_CAT 3 //number of categories not including "Everything" (modify if add more)
#define BUZZER 1 //buzzer pin for gpio
#define CAT_NXT 6 //category and next switch pin for gpio
#define START 0 //START and STOP switch pin for gpio
#define TEAM_ONE 12 //team one score switch pin for gpio
#define TEAM_TWO 13 //team two score switch pin for gpio
//for importing python script
PyObject *pName, *pModule, *pDict, *pFunc1, *pFunc2, *pFunc3, *pFunc4, *pVal;
//number of files found in words directory
//and therefore the number of categories
int num_fnames=0;
int num_cat=0;
//currently selected category
//default is EVERYTHING
int category_g=0;
//integer array holding number of strings in each category
int *num_strcat=NULL;
//array of strings holding filenames
char **fname_arr=NULL;
//array of category names
char **cat_arr=NULL;
//three dimensional array of strings
//to hold arrays of each category's words
//Example: meta_arr[1][2] would be the third string of the first
// actual category (index 0 holds the empty EVERYTHING category)
char ***meta_arr=NULL;
//string to hold msg to be output to lcd
char *lcd_str=NULL;
//one character strings(plus null) to hold converted integer scores
char *t1=NULL;
char *t2=NULL;
//child thread for buzzer timer
pthread_t tim_thread;
pthread_attr_t attr;//attribute for thread stack size
//vars to hold pwm calculations
float clkdiv=0;
float range_per=0;
//variables for switching between
//different sections of interrupt functions
int irupt1=0;
int irupt2=0;
int irupt3=0;
//vars for keeping up with scores
int team1=0;
int team2=0;
//prototypes
void load_words(void);
void display_word(void);
int get_numstr(char *filename);
int get_rand(int numof);
void rm_fext(char *fname);
void load_fnames_cat(char *dirstr);
void print_cat(void);
void menu(void);
void sighand_timer(int sig);
float clk_div(float freq);
float range(float freq, float period);
void *timer_buzz(void *ptr);
void change_cat_point(void);
void team_one_p(void);
void team_two_p(void);
void start_buzz_stop(void);
void assign_point_buzz(void);
void debounce_sw(int pin);
int main(void) {
//local var declarations
char c;
int i,b,nstr=0;
//load filenames and categories into arrays
load_fnames_cat("./words");
//set 256Mb for stack size to be allocated when
//pthread_create is called for timer thread
size_t stk_alloc = 256*1024*1024;
pthread_attr_init(&attr);
pthread_attr_setstacksize(&attr, stk_alloc);//prevents stack overflow issues
//allocate memory for lcd msg strings and score strings
lcd_str=(char*)malloc(50*sizeof(char));
t1=(char*)malloc(3*sizeof(char));
t2=(char*)malloc(3*sizeof(char));
//load word lists from all proper files in words directory into memory
load_words();
//seed random number generator
srand(time(0));
//start python interpretation
Py_Initialize();
PyEval_InitThreads();
//initialize wiringPi
if (wiringPiSetup() < 0) {
fprintf(stderr, "ERROR! WiringPi was unable to start: %s\n", strerror(errno));
return 1;
}
//set buttons as input
pinMode(CAT_NXT,INPUT);
pinMode(START,INPUT);
pinMode(TEAM_ONE,INPUT);
pinMode(TEAM_TWO,INPUT);
//set all buttons to pull down
pullUpDnControl(CAT_NXT,PUD_DOWN);
pullUpDnControl(START,PUD_DOWN);
pullUpDnControl(TEAM_ONE,PUD_DOWN);
pullUpDnControl(TEAM_TWO,PUD_DOWN);
//load python script file
SCRIPT_LOAD("lcd_funcs");
//startup sequence
LCD_INIT();
LCD_OUT("RasPiPhrase\n Version 1.0");
sleep(2);
LCD_OUT(" Created by\nChris B. Harvey");
sleep(2);
LCD_CLEAR();
//set interrupts
wiringPiISR(CAT_NXT, INT_EDGE_FALLING,change_cat_point);
wiringPiISR(START,INT_EDGE_FALLING,start_buzz_stop);
wiringPiISR(TEAM_ONE, INT_EDGE_FALLING, team_one_p);
wiringPiISR(TEAM_TWO, INT_EDGE_FALLING, team_two_p);
//start game by loading category select menu
menu();
scanf("%c",&c);//wait for key input to end
LCD_CLEAR();
LCD_BLIGHT(OFF);
//dereference python objects
CLEAN_PY();
//end python interpretation
Py_Finalize();
//properly free all dynamically allocated memory
for(i=0;i<num_cat;i++){
nstr=num_strcat[i];
for(b=0;b<nstr;b++){
free(meta_arr[i][b]);
}
free(meta_arr[i]);
}
free(meta_arr);
free(lcd_str);
free(t1);
free(t2);
return 0;
}
//allocate space for string arrays and load file names and categories
//file format should be CATEGORY_.txt
void load_fnames_cat(char *dirstr){
int prev_mem=0,cur_mem=0;
//directory related declarations
DIR *word_direc;
struct dirent *dir;
//open the supplied directory
word_direc = opendir(dirstr);
//if successful read in the directory
if (word_direc){
//inc vars for first allocation
num_fnames++;
cur_mem=18+prev_mem;
//initial allocation for "dummy" EVERYTHING category
fname_arr=realloc(fname_arr,sizeof(char*)*num_fnames);
fname_arr[num_fnames-1]=malloc(sizeof(char)*cur_mem);//malloc: the new pointer slot from realloc above is uninitialized
cat_arr=realloc(cat_arr,sizeof(char*)*num_fnames);
cat_arr[num_fnames-1]=malloc(sizeof(char)*cur_mem);
//set vars
prev_mem=cur_mem;
strcpy(fname_arr[num_fnames-1],"EVERYTHING_.txt");//doesn't exist just helps with coding clarity
strcpy(cat_arr[num_fnames-1],"EVERYTHING");
//allocate array of strings for filenames and categories
//and place them into the arrays
while ((dir=readdir(word_direc))!=NULL){
//attempt to ensure only correct files are selected and temp files/others are ignored
if((strlen(dir->d_name)>2)&&(rindex(dir->d_name,'_')!=NULL)&&(rindex(dir->d_name,'~')==NULL)){
//inc vars
num_fnames++;
cur_mem=18+prev_mem;
//allocate array mem
fname_arr=realloc(fname_arr,sizeof(char*)*num_fnames);
fname_arr[num_fnames-1]=malloc(sizeof(char)*cur_mem);//again malloc: the new slot from realloc is uninitialized
cat_arr=realloc(cat_arr,sizeof(char*)*num_fnames);
cat_arr[num_fnames-1]=malloc(sizeof(char)*cur_mem);
//set vars
prev_mem=cur_mem;
strcpy(fname_arr[num_fnames-1],(dir->d_name));
strcpy(cat_arr[num_fnames-1],fname_arr[num_fnames-1]);
//remove file extension and underscore from filenames
rm_fext(cat_arr[num_fnames-1]);
}
}
//clean up
closedir(word_direc);
//set number of categories
num_cat=num_fnames;
}
}
void load_words(void){
FILE *fp;
int i,b,temp_strnum=0;
char dir[]="./words/";
char str_pathfile[40];
//allocate space for each category for both
meta_arr=(char***)malloc(sizeof(char**)*num_cat);
num_strcat=(int*)malloc(sizeof(int)*num_cat);
for(i=0;i<num_cat;i++){
if(i==0){
//blank for everything category
meta_arr[i]=(char**)malloc(sizeof(char*)*1);
meta_arr[i][i]=(char*)malloc(1*sizeof(char));
meta_arr[i][i][i]='\0';
num_strcat[0]=1;//doesn't matter (EVERYTHING category)
}
else{
//format string to be path to file
strcpy(str_pathfile,dir);
strcat(str_pathfile,fname_arr[i]);
temp_strnum=get_numstr(str_pathfile);
//set number of strings for current category
num_strcat[i]=temp_strnum;
//open file for reading
fp=fopen(str_pathfile,"r");
//allocate memory for pointer to each string and each string itself
meta_arr[i]=(char**)malloc(sizeof(char*)*temp_strnum);
for(b=0;b<temp_strnum;b++){
meta_arr[i][b]=(char*)malloc(16*sizeof(char));
//load in word from file into memory
if((fscanf(fp,"%s",meta_arr[i][b]))!=1){
printf("\n\nERROR loading\nwords from file!\n");
sleep(2);
exit(1);
}
}
fclose(fp);
}
}
}
//output string constructed from current word
//as well as converted team scores
void display_word(void){
char *str=NULL;
int tmp_rand=0;//for use in recursion for everything category
//convert integer score to character
//and put into the correct string
t1[0]=(char)(((int)'0')+team1);
t2[0]=(char)(((int)'0')+team2);
t1[1]='\0';
t2[1]='\0';
if(category_g!=0){
//randomly select string
str=meta_arr[category_g][get_rand(num_strcat[category_g])];
}
else{
//randomly select category and string for EVERYTHING
tmp_rand=get_rand(num_cat);
while(tmp_rand==0){
tmp_rand=get_rand(num_cat);
}
str=meta_arr[tmp_rand][get_rand(num_strcat[tmp_rand])];
}
//construct string from scores and word then output to lcd
strcpy(lcd_str,str);
strcat(lcd_str,"\nTEAM1:");
strcat(lcd_str,t1);
strcat(lcd_str," TEAM2:");
strcat(lcd_str,t2);
LCD_OUT(lcd_str);
}
//prints string constructed from current category and each team's score
void print_cat(void){
//convert ints to characters and put into string
t1[0]=(char)(((int)'0')+team1);
t2[0]=(char)(((int)'0')+team2);
t1[1]='\0';
t2[1]='\0';
//construct string from scores and category name then output to lcd
strcpy(lcd_str,cat_arr[category_g]);
strcat(lcd_str,"\nTEAM1:");
strcat(lcd_str,t1);
strcat(lcd_str," TEAM2:");
strcat(lcd_str,t2);
LCD_OUT(lcd_str);
}
//opens the specified file and
//counts the number of strings
int get_numstr(char *filename){
int fsize=0;
char *string=NULL;
string=(char*)malloc(20*sizeof(char));
FILE *fp;
fp=fopen(filename,"r");
while(fscanf(fp,"%s",string)==1){
fsize++;
}
fclose(fp);
free(string);
return fsize;
}
//gets a random number within range
//specified by number of items
int get_rand(int numof){
int random=rand()%numof;
return random;
}
//function to remove file extension and underscore
void rm_fext(char *fname){
char *ptr_todot=rindex(fname,'_');//find where the underscore is stored
if(ptr_todot!=NULL){
(*ptr_todot)='\0';//replace with terminator to drop the underscore suffix and extension
}
}
//category select menu
void menu(void){
category_g=0;
LCD_OUT("Please select\n a category.")
sleep(1);
print_cat();
irupt1=1;
}
//function to debounce button on supplied pin
void debounce_sw(int pin){
int cnt=0,wait=1;
while(wait){
usleep(1000);//sleep 1ms
if(digitalRead(pin)==1){
cnt=0;
//continue to wait for bouncing to stop
}
else{
cnt++;
}
if(cnt>=10){
//we can now end loop
wait=0;
}
}
}
//interrupt function for when the start/stop button is pressed
void start_buzz_stop(void){
debounce_sw(START);
//set start/stop button to pulldown
pullUpDnControl(START,PUD_DOWN);
if(irupt1){
irupt1=0;
pthread_create(&tim_thread,&attr,timer_buzz,NULL);
}
else if(irupt2){
irupt2=0;
pthread_kill(tim_thread,SIGUSR1);
pthread_join(tim_thread,NULL);//make sure to wait until child thread terminates
pwmWrite(BUZZER,0);
digitalWrite(BUZZER, LOW);
menu();
}
}
//interrupt function for when the category/next button is pressed
void change_cat_point(void){
debounce_sw(CAT_NXT);
//set category/next button to pull down
pullUpDnControl(CAT_NXT,PUD_DOWN);
if(irupt1){
category_g++;
if(category_g>=num_cat){
category_g=0;//wrap back around
}
print_cat();
}
else if(irupt2){
irupt2=0;
pthread_kill(tim_thread,SIGUSR1);
pthread_join(tim_thread,NULL);//make sure to wait until child thread terminates
pwmWrite(BUZZER,0);
digitalWrite(BUZZER, LOW);
LCD_OUT("Assign point to\nthe proper team.");
irupt3=1;
}
}
//function to assign point after timer expires
void assign_point_buzz(void) {
irupt2=0;
LCD_OUT("Time expired!!!!");
pthread_join(tim_thread,NULL);//make sure to wait until child thread terminates
sleep(1);
LCD_OUT("Assign point to\nthe proper team.");
irupt3=1;
}
//interrupt function to give point when team one button is pressed
void team_one_p(void){
debounce_sw(TEAM_ONE);
//set pull down for score buttons
pullUpDnControl(TEAM_ONE,PUD_DOWN);
pullUpDnControl(TEAM_TWO,PUD_DOWN);
if(irupt3){
irupt3=0;
team1++;
if(team1>=7){
LCD_OUT(" TEAM1 WINS!!!");
sleep(2);
team1=0;
team2=0;
}
menu();
}
}
//interrupt function to give point when team two button is pressed
void team_two_p(void){
debounce_sw(TEAM_TWO);
//set pull down for score buttons
pullUpDnControl(TEAM_ONE,PUD_DOWN);
pullUpDnControl(TEAM_TWO,PUD_DOWN);
if(irupt3){
irupt3=0;
team2++;
if(team2>=7){
LCD_OUT(" TEAM2 WINS!!!");
sleep(2);
team1=0;
team2=0;
}
menu();
}
}
//signal handler for SIGUSR1 signal when
//sent to timer child thread
void sighand_timer(int sig){
pthread_exit(0);//end child thread
}
//function called by child thread to begin timer and buzzer sequence
void *timer_buzz(void *ptr) {
signal(SIGUSR1,sighand_timer);//call function when SIGUSR1 is caught
irupt1=0;
irupt2=1;
display_word();
clk_div(400); //400 Hz, clock_div func supplies correct value
range(400, 15); //15s Period, range func supplies correct value
pinMode(BUZZER, PWM_OUTPUT);
//and set its value to 0
pwmSetMode(PWM_MODE_MS);
pwmWrite(BUZZER, 0);
digitalWrite(BUZZER, LOW);
//bring frequency to 400Hz
//originally 19.2MHz PWM Clock
pwmSetClock(clkdiv);
pwmSetRange(range_per);
//begin pwm writing sequency to produce
//square waves that mimic the catchphrase timer
pwmWrite(BUZZER, (range_per * .25));
sleep(12);
pwmSetRange(range(400, 12));
pwmWrite(BUZZER, (range_per * .50));
sleep(10);
pwmSetRange(range(400, 10));
pwmWrite(BUZZER, (range_per * .75));
sleep(10);
pwmSetRange(range(400, 8));
pwmWrite(BUZZER, (range_per * .90));
sleep(10);
pwmSetRange(range(400, 6));
pwmWrite(BUZZER, (range_per * .90));
sleep(10);
pwmSetClock(clk_div(300));
pwmSetRange(range(300, 15));
pwmWrite(BUZZER, range_per);
sleep(1);
assign_point_buzz();
sleep(1);
pwmWrite(BUZZER, 0);
digitalWrite(BUZZER, LOW);
return 0;
}
//calculates pwm clock divisor for a frequency in Hz
float clk_div(float freq) {
clkdiv = 19.2 * (pow(10, 6)) / freq;
return clkdiv;
}
//calculates pwm range for a period in seconds and a frequency in Hz
float range(float freq, float period) {
range_per = period / (1 / freq);
return range_per;
}
Code:
/*
============================================================================
Name : lcd.h
Author : Christopher B. Harvey
Version : 1.0
Description : Header that contains function-like macro definitions
* for lcd related tasks through use of python scripts.
============================================================================
*/
//functionality defines for lcd backlight function
#define ON 1
#define OFF 0
#define SCRIPT_LOAD(pyfile) \
pName = PyString_FromString(pyfile); \
PyRun_SimpleString("import sys"); \
PyRun_SimpleString("sys.path.append(\"/home/rasppi/src/py_scripts\")"); \
pModule = PyImport_Import(pName); \
pDict = PyModule_GetDict(pModule);
#define LCD_INIT() \
pFunc1 = PyDict_GetItemString(pDict, "lcdinit"); \
if (PyCallable_Check(pFunc1)){ \
PyObject_CallObject(pFunc1, NULL); \
} else { \
PyErr_Print(); }
#define LCD_OUT(stri) \
pFunc2 = PyDict_GetItemString(pDict, "lcdout"); \
if (PyCallable_Check(pFunc2)){ \
PyObject_CallFunction(pFunc2, "s", stri); \
} else { \
PyErr_Print(); }
#define LCD_BLIGHT(i) \
pFunc3 = PyDict_GetItemString(pDict, "lcdbacklight"); \
if (PyCallable_Check(pFunc3)){ \
PyObject_CallFunction(pFunc3, "i", i); \
} else { \
PyErr_Print(); }
#define LCD_CLEAR() \
pFunc4 = PyDict_GetItemString(pDict, "lcdclear"); \
if (PyCallable_Check(pFunc4)){ \
PyObject_CallObject(pFunc4, NULL); \
} else { \
PyErr_Print(); }
#define CLEAN_PY() \
Py_DECREF(pModule); \
Py_DECREF(pName); \

[Q] app slow down when using sqlite database

I am trying to create a circular buffer using SQLite. For some reason, every time I instantiate my DB access class, the OS starts skipping frames (I am using the emulator to run my code).
02-22 20:22:03.172: I/Choreographer(860): Skipped 628 frames! The application may be doing too much work on its main thread.
I do not understand what I am doing wrong. I am calling the database class from an IntentService (I assume this should not slow down the main thread at all) as follows:
Code:
private SqliteLog mSqliteLog;
mSqliteLog = new SqliteLog(context);
mSqliteLog.writelogInformation("sleepMode", "ON");
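For context, those calls are assumed to live inside the IntentService's onHandleIntent(), which Android runs on a worker thread rather than the main thread; a minimal sketch of that wrapper (the class name LogIntentService is made up):
Code:
// Sketch only: where the SqliteLog calls above are assumed to run.
import android.app.IntentService;
import android.content.Intent;

public class LogIntentService extends IntentService {
    public LogIntentService() {
        super("LogIntentService");
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        // onHandleIntent() is invoked on a background worker thread
        SqliteLog sqliteLog = new SqliteLog(this);
        sqliteLog.writelogInformation("sleepMode", "ON");
    }
}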
I added my code at the end of this message
Code:
/**
* SqliteLog
*
*
* Base class for sqliteLog control
*
*
*/
public class SqliteLog {
// Debug log tag
private static final String tag = "SqliteLog";
// Version of database
private static final int DATABASE_VERSION = 1;
// Name of database
private static final String DATABASE_NAME = "log";
// Table of database
private static final String TABLE_NAME = "log";
public static final String ROWID_NAME = "id";
public static final String PREFERENCE_NAME = tag + "Pref";
public static final String COLUMN_LOGNUMBER = "logNumber";
public static final String COLUMN_TIME = "time";
public static final String COLUMN_FUNCTION = "function";
public static final String COLUMN_DESCRIPTION = "description";
public static final int TABLE_SIZE = 20;
private static final String DATABASE_CREATE ="create table " + TABLE_NAME + " (" + ROWID_NAME + " integer primary key autoincrement, " +
COLUMN_LOGNUMBER + " INTEGER NOT NULL, " +
COLUMN_TIME + " TEXT NOT NULL, " +
COLUMN_FUNCTION + " TEXT NOT NULL, " +
COLUMN_DESCRIPTION + " TEXT NOT NULL " +
");";
//The context of the calling class;
private Context thisContext;
/**
* <p>Constructor for SqliteLog
* @param context :- Context of calling class
*
*/
public SqliteLog(Context context) {
Log.d(tag,"SqliteLog constructor called");
thisContext = context;
}
/**
* writelogInformation :- Writes a row into the log table
*
*/
public void writelogInformation(String functionName, String descriptionInfo) {
// Retrieve preferences
SharedPreferences SqliteLogPref = thisContext.getSharedPreferences(PREFERENCE_NAME, Context.MODE_PRIVATE);
int logNumber = SqliteLogPref.getInt("logNumber", 1);
// Open database for writing
DatabaseHelper databaseHelper = new DatabaseHelper(thisContext);
SQLiteDatabase sQLiteDatabase = databaseHelper.getWritableDatabase();
// Define the column name and data
ContentValues values = new ContentValues();
values.put(COLUMN_LOGNUMBER, logNumber);
values.put(COLUMN_TIME, getTime());
values.put(COLUMN_FUNCTION, functionName);
values.put(COLUMN_DESCRIPTION, descriptionInfo);
// Update the row for this log number (a null where clause would update every row)
sQLiteDatabase.update(TABLE_NAME, values, COLUMN_LOGNUMBER + "=?", new String[] { String.valueOf(logNumber) });
// Close database
databaseHelper.close();
// Test if next database update will need to be wrapped around
logNumber = (logNumber % TABLE_SIZE) + 1;
// Store preferences
SharedPreferences.Editor editor = SqliteLogPref.edit();
editor.putInt("logNumber", logNumber);
editor.commit();
}
/**
* clearLog :- Erase all information from table
*
*/
public void clearLog() {
// Retrieve preferences
SharedPreferences SqliteLogPref = thisContext.getSharedPreferences(PREFERENCE_NAME, 0);
// Store preferences
SharedPreferences.Editor editor = SqliteLogPref.edit();
editor.putInt("logNumber", 1);
editor.commit();
// Delete all rows
DatabaseHelper databaseHelper = new DatabaseHelper(thisContext);
SQLiteDatabase sQLiteDatabase = databaseHelper.getReadableDatabase();
sQLiteDatabase.delete (TABLE_NAME, null, null);
}
/**
* readlogInformation :- Read the whole table
*
*/
public String[] readlogInformation() {
// Create string array of appropriate length
String[] returnArray;
// Retrieve preferences
SharedPreferences SqliteLogPref = thisContext.getSharedPreferences(PREFERENCE_NAME, 0);
int logNumber = SqliteLogPref.getInt("logNumber", 0);
// Open database for reading
DatabaseHelper databaseHelper = new DatabaseHelper(thisContext);
try {
SQLiteDatabase sQLiteDatabase = databaseHelper.getReadableDatabase();
// Get a cursor to the correct cell
Cursor cursor = sQLiteDatabase.query(TABLE_NAME, null, null, null, null, null, null);
// Get number of rows in table
int lengthOfTable = 0;
// Move cursor to where it needs to be
if (cursor != null) {
lengthOfTable = cursor.getCount();
// If count is less than max, then we have not wrapped around yet
if(lengthOfTable < TABLE_SIZE) {
cursor.moveToFirst();
}
// Position cursor appropriately
else {
cursor.moveToPosition(logNumber-1);
}
// Create string array of appropriate length
returnArray = new String[lengthOfTable];
for(int i=0; i<lengthOfTable; i++) {
returnArray[i] = cursor.getString(1) + "; " + cursor.getString(2) + "; " + cursor.getString(3);
cursor.moveToNext();
}
}
else {
Log.e(tag,"Cursor null");
// Create string array of appropriate length
returnArray = new String[0];
}
} catch(SQLiteException e) {
Log.d(tag,"SQLiteException when using getReadableDatabase");
// Create string array of appropriate length
returnArray = new String[0];
}
// Close database
databaseHelper.close();
return returnArray;
}
/**
* readlogInformation :- Read the whole table
*
*/
public String getTime() {
// Create a new time object
Time currentTime = new Time(Time.getCurrentTimezone());
// Get current time
currentTime.setToNow();
return currentTime.toString();
}
/**
* DatabaseHelper
*
*
* Class to help control database
*
*
*/
private static class DatabaseHelper extends SQLiteOpenHelper {
/**
* <p>Constructor for DatabaseHelper
* @param context :- Context of calling class<p>
*
*/
DatabaseHelper(Context context) {
super(context, DATABASE_NAME, null, DATABASE_VERSION);
Log.d(tag,"DatabaseHelper constructor called");
}
/**
* <p>onCreate
* @param db :- Pass an sqlite object
*
*/
@Override
public void onCreate(SQLiteDatabase db) {
Log.d(tag,"onCreate called");
// Create database
db.execSQL(DATABASE_CREATE);
// Insert a new row
ContentValues values = new ContentValues();
// Create a certain number of rows
for(int i=1; i<=TABLE_SIZE; i++) {
values.clear();
values.put(COLUMN_LOGNUMBER, i);
values.put(COLUMN_FUNCTION, "empty");
values.put(COLUMN_DESCRIPTION, "empty");
db.insert(TABLE_NAME, "null", values);
}
Log.d(tag,"database created");
}
/**
* <p>onUpgrade
* @param db :- Pass an sqlite object
* @param oldVersion :- Old version of table
* @param newVersion :- New version of table
*
*/
@Override
public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
Log.d(tag,"onUpgrade called");
// Not used, but you could upgrade the database with ALTER
// Scripts
}
}
}
I have been trying to figure this out for a while now. I would appreciate any insight, Amish

error: package com.google.android.maps does not exist

Hi,
I'm developing a map app and want to implement Google Maps search.
I get the following build error:
"error: package com.google.android.maps does not exist"
I'm developing under Android Studio 3.1.3.
Hints for a solution would be appreciated.
Thanks
You need to add a reference to the maps package in build.gradle
I tried this: https://stackoverflow.com/questions...le-android-maps-does-not-exist-android-studio
but without success.
Can you post the content of your build.gradle file?
apply plugin: 'com.android.application'
android {
compileSdkVersion 28
defaultConfig {
applicationId "com.example.tux.myapplication"
minSdkVersion 15
targetSdkVersion 28
versionCode 1
versionName "1.0"
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
}
}
}
dependencies {
implementation fileTree(dir: 'libs', include: ['*.jar'])
implementation 'com.android.support:appcompat-v7:28.0.0-alpha3'
implementation 'com.google.android.gms:play-services-maps:15.0.1'
testImplementation 'junit:junit:4.12'
androidTestImplementation 'com.android.support.test:runner:1.0.2'
androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.2'
}
elvis61 said: (build.gradle quoted above)
Apologies for the late reply, I didn't get a notification from the thread.
Try removing the lines:
implementation fileTree(dir: 'libs', include: ['*.jar'])
implementation 'com.google.android.gms:play-services-maps:15.0.1'
And add:
implementation 'com.google.android.gms:play-services-maps:16.0.0'
If that doesn't work I'll draw up a demo app - this is a bit odd, as I'm actually using maps in an app for my uni coursework and it's working fine.
Edit: Also check that you have added the Google Maven repo (the google() entry in the repositories block) to the top-level build.gradle file.
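For reference, the compiler error names com.google.android.maps, which is the long-retired Maps API v1 package; with the play-services-maps dependency in place, the activity should import the com.google.android.gms.maps classes instead. Below is a minimal sketch along the lines of the standard Android Studio maps template, assuming a layout named activity_maps that contains a SupportMapFragment with the id map:

import android.os.Bundle;
import android.support.v4.app.FragmentActivity;

import com.google.android.gms.maps.CameraUpdateFactory;
import com.google.android.gms.maps.GoogleMap;
import com.google.android.gms.maps.OnMapReadyCallback;
import com.google.android.gms.maps.SupportMapFragment;
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.MarkerOptions;

public class MapsActivity extends FragmentActivity implements OnMapReadyCallback {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_maps);
        // Request the map asynchronously; onMapReady fires once it is available
        SupportMapFragment mapFragment =
                (SupportMapFragment) getSupportFragmentManager().findFragmentById(R.id.map);
        mapFragment.getMapAsync(this);
    }

    @Override
    public void onMapReady(GoogleMap googleMap) {
        // Drop a marker on Sydney and move the camera there, as in the template
        LatLng sydney = new LatLng(-33.87, 151.21);
        googleMap.addMarker(new MarkerOptions().position(sydney).title("Sydney"));
        googleMap.moveCamera(CameraUpdateFactory.newLatLng(sydney));
    }
}

Once this builds, the search-by-address version posted further down only needs a Geocoder on top of the same imports.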
Hi,
I still have the same error. I explained my goal above. I tried this suggestion:
https://stackoverflow.com/questions...ment-google-maps-search-by-address-in-android
and got this error.
And here is another approach:
https://stackoverflow.com/questions...-name-on-google-map-android?noredirect=1&lq=1
I don't know which one is better, i.e. which one I can build without error.
Hi jonny,
I now have a version that could work for me, but something is missing. Can you please add the missing parts?
Thanks
public class MapsActivity extends FragmentActivity implements OnMapReadyCallback {
private GoogleMap mMap;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_maps);
// Obtain the SupportMapFragment and get notified when the map is ready to be used.
SupportMapFragment mapFragment = (SupportMapFragment) getSupportFragmentManager()
.findFragmentById(R.id.map);
mapFragment.getMapAsync(this);
}
/**
* Manipulates the map once available.
* This callback is triggered when the map is ready to be used.
* This is where we can add markers or lines, add listeners or move the camera. In this case,
* we just add a marker near Sydney, Australia.
* If Google Play services is not installed on the device, the user will be prompted to install
* it inside the SupportMapFragment. This method will only be triggered once the user has
* installed Google Play services and returned to the app.
*/
@Override
public void onMapReady(GoogleMap googleMap) {
mMap = googleMap;
// Add a marker and move the camera
// Warning: calling setContentView() again here replaces the layout that hosts the map fragment;
// the EditText should normally live in the same layout as the map (see the sketch further down)
setContentView(R.layout.activity_place);
EditText geoName = (EditText) findViewById(R.id.geoName);
String sGeoName = geoName.getText().toString();
LatLng geoPlace = getLocationFromAddress(this, sGeoName);
// Guard against a failed geocode, otherwise position(null) throws an exception
if (geoPlace != null) {
mMap.addMarker(new MarkerOptions().position(geoPlace).title(sGeoName));
mMap.moveCamera(CameraUpdateFactory.newLatLng(geoPlace));
}
}
/**
*
* @param context
* @param strAddress
* @return
*/
public LatLng getLocationFromAddress(Context context, String strAddress) {
Geocoder coder = new Geocoder(context);
List<Address> address;
LatLng p1 = null;
try {
// May throw an IOException
address = coder.getFromLocationName(strAddress, 5);
if (address == null || address.isEmpty()) {
return null;
}
Address location = address.get(0);
p1 = new LatLng(location.getLatitude(), location.getLongitude());
} catch (IOException ex) {
ex.printStackTrace();
}
return p1;
}
}
Briefly: I will enter a place name and it should move the camera there. Easy, but not for a newbie. Please support me.
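For what it's worth, here is a minimal sketch of one way the missing piece could look: a button listener inside onMapReady() that geocodes whatever is typed into the EditText and moves the camera there. It assumes the EditText (R.id.geoName) and a search Button (R.id.searchBtn, a hypothetical id) sit in the same activity_maps layout as the map, so the second setContentView() call is no longer needed, and it reuses the getLocationFromAddress() helper from the class above.

// Drop-in replacement for onMapReady() in the MapsActivity above.
// Extra imports needed: android.widget.Button, android.widget.EditText, android.view.View.
@Override
public void onMapReady(GoogleMap googleMap) {
    mMap = googleMap;
    final EditText geoName = (EditText) findViewById(R.id.geoName);
    Button searchBtn = (Button) findViewById(R.id.searchBtn); // hypothetical button id
    searchBtn.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            String sGeoName = geoName.getText().toString();
            // Geocoder does blocking network I/O; move this off the UI thread for production use
            LatLng geoPlace = getLocationFromAddress(MapsActivity.this, sGeoName);
            if (geoPlace != null) {
                mMap.clear(); // remove any previous marker
                mMap.addMarker(new MarkerOptions().position(geoPlace).title(sGeoName));
                mMap.moveCamera(CameraUpdateFactory.newLatLngZoom(geoPlace, 12f));
            }
        }
    });
}

With minSdkVersion 15 the anonymous listener is the safe form; on newer toolchains a lambda works just as well.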
