System Requirement
How to use
Nuwa Developer MODE
Change your adb password
How to install APK
Initiation
Register callback
Check that the Nuwa Robot SDK engine is ready before calling any Robot SDK API
Nuwa Motion player
Overlap Face Window control
Voice wakeup
Local TTS
Local ASR
Cloud Speech To Text
Sensor event
LED control
Motor control
Movement control
Face control
Custom Behavior
Safe mode
Handle Robot Drop event
Handle Robot Service recovery
How to show an entry point on the Robot Menu
Launch Developer App via Voice command
Disable Support Always Wakeup
Setup 3rd App as Launcher app to replace Nuwa Face Activity
How to call Nuwa UI to add face recognition
How to change your adb password
Prevent HOME KEY to suspend device
How to auto-launch a 3rd-party Activity at boot
Q & A
System Requirement
Kebbi Air is based on Android P. You can use Android Studio or another IDE for Android development. Please refer to the Android Studio documentation for more information.
How to use
Get the Nuwa SDK .aar and import it into your Android Studio project or Unity IDE.
The Nuwa Robot SDK supports the following development environments.
Developers can refer to the Nuwarobotics official developer website or GitHub to understand API usage.
NOTICE : The Content Editor is only for business partners; please contact Nuwa Robotics for more information.
Nuwa Developer MODE
How to enable developer mode
1. Open Settings
2. click "About Kebbi" 10 times
3. The hidden developer options are shown on the right side of the page.
How to hide developer option
1. Open Settings
2. click "About Kebbi" 10 times
3. All developer options will be hidden.
Change your adb password
Developers can change the ADB password to prevent other users from accessing the Android system.
1. Open Setting
2. click "About Kebbi" 10 times to enable developer menu
3. Click "ADB Password Settings" (ADBパスワード設定)
4. Replace the default password with your own.
How to install APK
You can use ADB (Android Debug Bridge) directly to install an APK on Kebbi Air.
After ADB is connected, you can use the adb install command to install the APK.
Initiation
Before using the Robot SDK, the App needs to create a single instance of the Nuwa Robot API and perform initiation once.
// client id: please use your own client id (e.g. your app package name)
// Once mRobot is created, the app will receive an onWikiServiceStart() callback
IClientId id = new IClientId("your_app_package_name");
NuwaRobotAPI mRobot = new NuwaRobotAPI(context, id); // context: your android.content.Context
// release Nuwa SDK resources when the App closes (or the Activity is paused/destroyed)
mRobot.release();
Register callback
Because all functions of the Nuwa Robot SDK are asynchronous (AIDL-based), the App needs to register callbacks to receive all kinds of notifications and results from the Robot.
mRobot.registerRobotEventListener(new RobotEventListener() {
public void onWikiServiceStart() {
// Once mRobot is created, you will receive a callback of onWikiServiceStart()
}
public void onTouchEvent(int type, int touch) {
}
.......
});
mRobot.registerVoiceEventListener(new VoiceEventListener() {
public void onTTSComplete(boolean isError) {
// TODO Auto-generated method stub
}
.......
});
You can also use the helper classes to override only the events your App needs.
mRobot.registerRobotEventListener(new RobotEventCallback() {
public void onTouchEvent(int type, int touch) {
}
});
mRobot.registerVoiceEventListener(new VoiceEventCallback() {
public void onTTSComplete(boolean isError) {
}
});
Check that the Nuwa Robot SDK engine is ready before calling any Robot SDK API
Once mRobot is created, you will receive an onWikiServiceStart() callback.
NOTICE : only call SDK APIs after onWikiServiceStart() has been received.
mRobot.registerRobotEventListener(new RobotEventListener() {
public void onWikiServiceStart() {
// Once mRobot is created, you will receive a callback of onWikiServiceStart()
}
.......
});
// or check state by
boolean isReady = mRobot.isKiWiServiceReady();
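Because every Robot SDK call must wait for onWikiServiceStart(), a common pattern is to queue work until the service is ready. A minimal sketch (ReadyGate is our own illustration, not part of the Nuwa SDK):

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Illustrative helper (not an SDK class): buffers work until
// onWikiServiceStart() has been received, then runs it immediately.
class ReadyGate {
    private final Queue<Runnable> pending = new ArrayDeque<>();
    private boolean ready = false;

    // Call this from RobotEventListener.onWikiServiceStart().
    public synchronized void onServiceReady() {
        ready = true;
        while (!pending.isEmpty()) {
            pending.poll().run();
        }
    }

    // Run now if the SDK is ready, otherwise defer until it is.
    public synchronized void runWhenReady(Runnable task) {
        if (ready) {
            task.run();
        } else {
            pending.add(task);
        }
    }

    public synchronized boolean isReady() {
        return ready;
    }
}
```

Usage: call `gate.runWhenReady(() -> mRobot.startTTS("hello"))` anywhere in the App, and `gate.onServiceReady()` inside onWikiServiceStart().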
Nuwa Motion player
A Nuwa motion file is Nuwa's proprietary robot motion control format, composed of "MP4(Face)", "Motor control", "Timeline control", "LED control", etc.
You can simply play a motion, and the Robot will perform a series of pre-defined actions.
PS: Some motions only include "body movements" without "MP4(Face)"
motionPlay(final String motion, final boolean auto_fadein)
motionPlay(final String name, final boolean auto_fadein, final String path)
motionPrepare(final String name)
motionPlay( )
motionStop(final boolean auto_fadeout)
getMotionList( )
motionSeek(final float time)
motionPause( )
motionResume( )
motionCurrentPosition( )
motionTotalDuration( )
/*
When a motion starts, you will get a callback of onStartOfMotionPlay(String motion).
When it finishes playing, you will get callbacks of
onStopOfMotionPlay(String motion)
onCompleteOfMotionPlay(String motion)
// if there is an error, the App will receive the error callback
public void onErrorOfMotionPlay(int errorcode);
*/
// use the default NUWA motion asset path
mRobot.motionPlay("001_J1_Good", true/false);
// give a specific motion asset path (internal use only)
mRobot.motionPlay("001_J1_Good", true/false, "/sdcard/download/001/");
// NOTICE: you must call motionStop(true) if your auto_fadein was true.
// will get a callback of onStopOfMotionPlay(String motion)
mRobot.motionStop(true/false);
// preload a motion, then play it back
// callback: onPrepareMotion(boolean isError, String motion, float duration)
mRobot.motionPrepare("001_J1_Good");
// pause, resume and seek
mRobot.motionPause();
float pos = mRobot.motionCurrentPosition();
mRobot.motionResume();
mRobot.motionSeek(1.5f); // seek to 1.5 sec
Overlap window control
While playing a motion, there is an overlay window on top of the screen; you can show or hide this window.
mRobot.showWindow(false);
mRobot.hideWindow(false);
Voice wakeup
You can say "Hello Kebbi" to get a wakeup event callback in your Application context ONCE, and the robot's system behavior will NOT respond at that time.
mRobot.startWakeUp(true);
// When the Robot hears "Hello Kebbi", the Kiwi agent will receive an onWakeup() callback of VoiceEventListener once
onWakeup(boolean isError, String score) {
// score: a json string given by the voice engine
// isError: whether the engine works normally
// Ex: {"eos":3870,"score":80,"bos":3270,"sst":"wakeup","id":0}
// score: confidence value
}
// stop listening for wakeup
mRobot.stopListen();
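The score string passed to onWakeup() is JSON like the example above. A minimal sketch of extracting the confidence value (using java.util.regex here for a dependency-free example; on Android you could use org.json.JSONObject instead):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative parser for the onWakeup() score string (a sketch, not SDK code).
class WakeupScoreParser {
    private static final Pattern SCORE = Pattern.compile("\"score\"\\s*:\\s*(\\d+)");

    // Returns the confidence value, or -1 if the field is missing.
    static int parseScore(String json) {
        Matcher m = SCORE.matcher(json);
        return m.find() ? Integer.parseInt(m.group(1)) : -1;
    }
}
```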
Local TTS
The Robot can speak a sentence from a given string.
mRobot.startTTS("Nice to meet you");
// you can cancel speaking at any time
mRobot.stopTTS();
// or speak in a specific language (see TTS Capability for market differences)
// mRobot.startTTS(TTS_sample, Locale.ENGLISH.toString());
// receive the onTTSComplete(boolean isError) callback of VoiceEventListener
onTTSComplete(boolean isError) {
}
TTS Capability (only supported on Kebbi Air)
* Taiwan Market : Locale.CHINESE\Locale.ENGLISH
* Chinese Market : Locale.CHINESE\Locale.ENGLISH
* Japan Market : Locale.JAPANESE\Locale.CHINESE\Locale.ENGLISH
* Worldwide Market : Locale.ENGLISH
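The market table above can be captured as a small lookup helper (a sketch; the market-name strings are our own labels, not SDK constants). Before calling mRobot.startTTS(text, locale.toString()), you could check that the locale is supported:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Locale;

// Sketch of the TTS capability table as a lookup helper (not part of the SDK).
class TtsCapability {
    static List<Locale> supportedLocales(String market) {
        switch (market) {
            case "Taiwan":   // Taiwan Market
            case "Chinese":  // Chinese Market
                return Arrays.asList(Locale.CHINESE, Locale.ENGLISH);
            case "Japan":    // Japan Market
                return Arrays.asList(Locale.JAPANESE, Locale.CHINESE, Locale.ENGLISH);
            default:         // Worldwide Market
                return Arrays.asList(Locale.ENGLISH);
        }
    }
}
```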
Local ASR
The ASR engine uses SimpleGrammarData to describe the list of keywords; the App needs to create a Grammar (command table) first.
Example
// list of phrases to listen for
ArrayList<String> cmd = new ArrayList<>();
cmd.add("I want to listen music");
cmd.add("play pop music");
cmd.add("listen love song");
// Build a Grammar object with the SimpleGrammarData class
SimpleGrammarData mGrammarData = new SimpleGrammarData("TutorialTest");
for (String string : cmd) {
mGrammarData.addSlot(string);
}
// update the grammar body
mGrammarData.updateBody();
// Register the local ASR grammar
mRobot.createGrammar(mGrammarData.grammar, mGrammarData.body);
// stop the ASR operation
mRobot.stopListen();
VoiceEventListener Callback
// receive a GrammarState callback of VoiceEventListener
public void onGrammarState(boolean b, String s) {
// now you can call startLocalCommand() API
mRobot.startLocalCommand();
}
// receive an Understand callback of VoiceEventListener
public void onMixUnderstandComplete(boolean b, ResultType resultType, String s) {
//Get ASR result string here
String result_string = VoiceResultJsonParser.parseVoiceResult(s);
}
Cloud Speech To Text and Local ASR mix operation
In ASR mix mode, the engine receives results from both the local and the cloud engine, but it returns only one of them. The rules are:
PS: Both cloud ASR and local ASR results are JSON-format strings.
createGrammar(final String grammar_id, final String grammar_string)
startMixUnderstand( )
stopListen( )
Example
// list of phrases to listen for
ArrayList<String> cmd = new ArrayList<>();
cmd.add("I want to listen music");
cmd.add("play pop music");
cmd.add("listen love song");
// Build a Grammar object with the SimpleGrammarData class
SimpleGrammarData mGrammarData = new SimpleGrammarData("TutorialTest");
for (String string : cmd) {
mGrammarData.addSlot(string);
}
// update the grammar body
mGrammarData.updateBody();
// Register the local ASR grammar
mRobot.createGrammar(mGrammarData.grammar, mGrammarData.body);
// stop the ASR operation
mRobot.stopListen();
VoiceEventListener Callback
// receive a callback of VoiceEventListener
onGrammarState(boolean isError, String info) {
// now you can call startMixUnderstand() API
// do mix mode ASR
mRobot.startMixUnderstand();
}
// get the local or cloud result
onMixUnderstandComplete(boolean isError, ResultType type, String json) {
//Get ASR result string
String result_string = VoiceResultJsonParser.parseVoiceResult(json);
//get ASR type
if (type == ResultType.LOCAL_COMMAND) {
//do something
}
}
Sensor event
The Robot provides Touch, PIR, and Drop sensor events.
Request a sensor event while you need it, and stop requesting it when you don't.
public static final int SENSOR_NONE = 0x00; //000000
public static final int SENSOR_TOUCH = 0x01; //000001
public static final int SENSOR_PIR = 0x02; //000010
public static final int SENSOR_DROP = 0x04; //000100
public static final int SENSOR_SYSTEM_ERROR = 0x08; //001000
// request touch sensor event
mRobot.requestSensor(SENSOR_TOUCH);
// or request the touch and PIR events (OR additional SENSOR_XXX flags together as needed)
mRobot.requestSensor(NuwaRobotAPI.SENSOR_TOUCH | NuwaRobotAPI.SENSOR_PIR | NuwaRobotAPI.SENSOR_XXX);
// get raw touch results of RobotEventListener
// type: head: 1, chest: 2, right hand: 3, left hand: 4, left face: 5,right face: 6.
// touch: touched: 1, untouched: 0
onTouchEvent(int type, int touch) {
}
// type: head: 1, chest: 2, right hand: 3, left hand: 4, left face: 5,right face: 6.
onTap(int type) {
}
// type: head: 1, chest: 2, right hand: 3, left hand: 4, left face: 5,right face: 6.
onLongPress(int type) {
}
// get PIR results of RobotEventListener
onPIREvent(int val) {
}
// stop all requested sensor event
mRobot.stopSensor(NuwaRobotAPI.SENSOR_NONE);
// stop multiple sensor events
mRobot.stopSensor(NuwaRobotAPI.SENSOR_TOUCH | NuwaRobotAPI.SENSOR_XXX);
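The SENSOR_* values above are bit flags, so requests combine with bitwise OR. A minimal sketch of the mask arithmetic (the helper class is our own illustration; the constants mirror the SDK values listed above):

```java
// Sketch of how the SENSOR_* bitmask values combine (not part of the SDK).
class SensorMask {
    static final int SENSOR_NONE  = 0x00; //000000
    static final int SENSOR_TOUCH = 0x01; //000001
    static final int SENSOR_PIR   = 0x02; //000010
    static final int SENSOR_DROP  = 0x04; //000100

    // Combine several sensors into one request mask.
    static int combine(int... sensors) {
        int mask = SENSOR_NONE;
        for (int s : sensors) mask |= s;
        return mask;
    }

    // Check whether a mask includes a given sensor.
    static boolean includes(int mask, int sensor) {
        return (mask & sensor) != 0;
    }
}
```

For example, `mRobot.requestSensor(SensorMask.combine(SensorMask.SENSOR_TOUCH, SensorMask.SENSOR_PIR))` is equivalent to OR-ing the NuwaRobotAPI constants directly.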
LED control
There are 4 LED parts on the Robot (Head, Chest, Right hand, Left hand), and the API can control each of them.
Each LED part has 2 modes - "Breath mode" and "Light on mode".
Before using an LED, you need to turn it on with the API first, and turn it off when it is no longer needed.
By default, the LEDs are controlled by the System. If the App wants different behavior, system control can be disabled via disableSystemLED().
If your App needs to control the Robot LEDs, call disableSystemLED() once, and call enableSystemLED() when the App enters the onPause state.
NOTICE : Kebbi Air does not support Face and Chest LED breath mode.
/*
id:1 = Face LED
id:2 = Chest LED
id:3 = Left hand LED
id:4 = Right hand LED
onOff: 0 or 1
brightness, Color-R, Color-G, Color-B: 0 ~ 255
interval: 0 ~ 15
ratio: 0 ~ 15
*/
// turn on LED
mRobot.enableLed(1, 1);
mRobot.enableLed(2, 1);
mRobot.enableLed(3, 1);
mRobot.enableLed(4, 1);
// Set LED color
mRobot.setLedColor(1, 255, 255, 255, 255);
mRobot.setLedColor(2, 255, 255, 0, 0);
mRobot.setLedColor(3, 255, 166, 255, 5);
mRobot.setLedColor(4, 255, 66, 66, 66);
// Switch to "Breath mode"
mRobot.enableLedBreath(1, 2, 9);
// turn off LED
mRobot.enableLed(1, 0);
mRobot.enableLed(2, 0);
mRobot.enableLed(3, 0);
mRobot.enableLed(4, 0);
Motor control
The Mibo Robot has 10 motors; using the API, you can control each of them.
ctlMotor(int motor, float degree, float speed)
motor : motor id (see the angle range table below)
degree : target degree (see the angle range table below)
speed : speed in degrees/sec
| ID  | 1   | 2   | 3   | 4    | 5   | 6   | 7   | 8    | 9   | 10  |
|-----|-----|-----|-----|------|-----|-----|-----|------|-----|-----|
| Max | 20  | 40  | 5   | 70   | 100 | 0   | 5   | 70   | 100 | 0   |
| Min | -20 | -40 | -85 | -200 | -3  | -80 | -85 | -200 | -3  | -80 |
// control the neck_y motor to 20 degrees at 30 degrees/sec speed
mRobot.ctlMotor(1, 20, 30f);
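Calling ctlMotor() with an angle outside the table's range is easy to do by accident. A small clamp helper (our own sketch, with the limits copied from the table above) can guard the call:

```java
// Sketch: clamp a requested angle to the per-motor limits from the table
// above before calling ctlMotor(). The arrays index motor IDs 1..10.
// This helper is our own illustration, not part of the Nuwa SDK.
class MotorLimits {
    static final float[] MAX = {20, 40, 5, 70, 100, 0, 5, 70, 100, 0};
    static final float[] MIN = {-20, -40, -85, -200, -3, -80, -85, -200, -3, -80};

    static float clamp(int motorId, float degree) {
        float max = MAX[motorId - 1];
        float min = MIN[motorId - 1];
        return Math.max(min, Math.min(max, degree));
    }
}
```

For example, `mRobot.ctlMotor(3, MotorLimits.clamp(3, -200f), 30f);` would clamp -200 to -85, the minimum for motor 3.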
Movement control
Control the Robot to move forward, move backward, turn, and stop.
Low level control:
Advanced control:
// go forward
mRobot.forwardInAccelerationEx();
// go back
mRobot.backInAccelerationEx();
// stop
mRobot.stopInAccelerationEx();
Face control
IMPORTANT NOTICE :
The Robot Face is an Activity, so the controller must be a SERVICE running in the background.
Face control API:
Example
---------------------------------------------------------------------------
// get control object
UnityFaceManager mFaceManager = UnityFaceManager.getInstance();
---------------------------------------------------------------------------
/**
* open face mouth method
* @param speed : mouth animation speed
*/
long speed = 300L;
mFaceManager.mouthOn(speed);
---------------------------------------------------------------------------
/**
* close face mouth method
*/
mFaceManager.mouthOff();
---------------------------------------------------------------------------
/**
* hide face method
*/
mFaceManager.hideUnity();
---------------------------------------------------------------------------
/**
* show face method
*/
mFaceManager.showUnity();
---------------------------------------------------------------------------
/**
* play unity face motion method
* @param json : unity face motion key
* @param mListener : callback for motion complete
*/
String json = "J2_Hug";
IonCompleteListener.Stub mListener = new IonCompleteListener.Stub() {
public void onComplete(String s) throws RemoteException {
Log.d("FaceControl", "onMotionComplete:" + s );
}
};
mFaceManager.playMotion(json,mListener);
---------------------------------------------------------------------------
/**
* register the callback for face touch events
* @param UnityFaceCallback : override the callback methods you need
*/
mFaceManager.registerCallback(new MyFaceCallback());
class MyFaceCallback extends UnityFaceCallback {
public void on_touch_left_eye() {
Log.d("FaceControl", "on_touch_left_eye()");
}
public void on_touch_right_eye() {
Log.d("FaceControl", "on_touch_right_eye()");
}
public void on_touch_nose() {
Log.d("FaceControl", "on_touch_nose()");
}
public void on_touch_mouth() {
Log.d("FaceControl", "on_touch_mouth()");
}
public void on_touch_head() {
Log.d("FaceControl", "on_touch_head()");
}
public void on_touch_left_edge() {
Log.d("FaceControl", "on_touch_left_edge()");
}
public void on_touch_right_edge() {
Log.d("FaceControl", "on_touch_right_edge()");
}
public void on_touch_bottom() {
Log.d("FaceControl", "on_touch_bottom()");
}
}
---------------------------------------------------------------------------
Custom Behavior
Target SDK : 2.0.0.05
IMPORTANT :
Custom behavior runs on the Robot Face Activity and the Nuwa Launcher Menu; please make sure no 3rd-party custom activity covers the foreground.
System behavior allows the developer to customize the response behavior for an NLP result. The developer needs to set up Chatbot Q&A on the NUWA Trainingkit website, which allows setting a Custom Intention for a sentence. The following sample code shows how to register to receive this Custom Intention notification and implement a customized response.
The developer should implement a class that determines how to react when receiving a customized NLP response from NUWA Trainingkit.
This can be achieved by extending the class BaseBehaviorService.
BaseBehaviorService declares three important functions: onInitialize(), createCustomBehavior() and notifyBehaviorFinished().
Two of them need to be implemented: onInitialize() and createCustomBehavior().
onInitialize()
createCustomBehavior()
notifyBehaviorFinished()
This class is encapsulated in BaseBehaviorService. The developer can get the instance created by BaseBehaviorService.
register(String pkgName, CustomBehavior action)
unregister(String pkgName, CustomBehavior action)
setWelcomeSentence(String[] sentences)
resetWelcomeSentence()
completeCustomBehavior()
The object that lets the developer define how to deal with a customized NLP response.
prepare(String parameter)
process(String parameter)
finish(String parameter)
public class CustomBehaviorImpl extends BaseBehaviorService {
public void onInitialize() {
handler = new Handler(Looper.getMainLooper()); // note: declare "private Handler handler;" as a field
try {
// TODO initialize
// Welcome sentences (Chinese): "Hello, %s. This is a welcome-sentence test!", "%s, I'm here", "%s, is there anything I can do for you?"
mSystemBehaviorManager.setWelcomeSentence(new String[]{"你好, %s.這是一個歡迎詞的測試!", "%s, 挖底家", "%s, 有什麼可以為您服務的嗎?"});
} catch (RemoteException e) {
e.printStackTrace();
}
}
public CustomBehavior createCustomBehavior() {
return new CustomBehavior.Stub() {
public void prepare(final String parameter) {
// TODO write your preparing work
}
public void process(final String parameter) {
// TODO the actual process logic
// TODO simulate asynchronous task while process complete
handler.postDelayed(new Runnable() {
public void run() {
try {
notifyBehaviorFinished();
} catch (RemoteException e) {
e.printStackTrace();
}
}
}, 5000);
}
public void finish(final String parameter) {
// TODO the whole session has been finished.
}
};
}
}
Safe mode
Keeps the Robot standing in place in any case. The Robot automatically enables "lock wheel mode" while AC power is plugged in. The App can call unlockWheel() to disable safe mode.
Handle Robot Drop event
When a Robot drop happens, an error message is delivered, and the App can receive it via the RobotEventListener callback.
// request Robot drop event first
mRobot.requestSensor(NuwaRobotAPI.SENSOR_DROP);
// handle it
public void onDropSensorEvent(int value) {
// value: 1 = drop, 0 = normal
// get the number of drop IR sensors
int val = mRobot.getDropSensorOfNumber();
}
// release sensor event
mRobot.stopSensor(NuwaRobotAPI.SENSOR_NONE);
Handle Robot Service Recovery
When the Robot service (Robot SDK) encounters an unexpected exception, it restarts automatically. The App can handle this scenario via the following RobotEventListener callbacks.
public void onWikiServiceStart() {
//1. Robot SDK is ready to use now.
//2. Robot SDK restart successfully, and it's ready to use now.
Log.d(TAG, "onWikiServiceStart");
}
@Override
public void onWikiServiceCrash() {
// When the Robot service (Robot SDK) encounters an unexpected exception, it shuts itself down.
Log.d(TAG, "onWikiServiceCrash");
}
@Override
public void onWikiServiceRecovery() {
// Robot service(Robot SDK) begins to restart itself.
// When it's ready, it will send "onWikiServiceStart" event to App again.
Log.d(TAG, "onWikiServiceRecovery");
}
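One way to consume the three callbacks above is a tiny state tracker (our own illustration, not an SDK class) that gates SDK calls while the service is down:

```java
// Illustrative state tracker driven by the three RobotEventListener
// callbacks above; not part of the Nuwa SDK.
class ServiceStateTracker {
    enum State { STARTING, READY, CRASHED, RECOVERING }

    private State state = State.STARTING;

    void onWikiServiceStart()    { state = State.READY; }
    void onWikiServiceCrash()    { state = State.CRASHED; }
    void onWikiServiceRecovery() { state = State.RECOVERING; }

    // Only allow SDK calls while the service is up.
    boolean canCallSdk() { return state == State.READY; }

    State current() { return state; }
}
```

Forward each RobotEventListener callback to the tracker and check canCallSdk() before invoking Robot SDK APIs.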
How to show an entry point on the Robot Menu
For Robot Generation 2 (Kebbi Air), the menu is based on the Launcher; please implement a standard launcher icon in AndroidManifest.xml.
<application
android:allowBackup="true"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/AppTheme">
<activity
android:name=".MainActivity"
android:label="@string/app_name"
android:theme="@style/AppTheme.NoActionBar">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
</application>
Android Developer reference link https://developer.android.com/guide/topics/manifest/manifest-intro#iconlabel
For Robot Generation 1 (Danny小丹\Kebbi凱比)
<activity
android:name=".MainActivity"
android:label="@string/app_name"
android:theme="@style/AppTheme.NoActionBar">
<meta-data
android:name="com.nuwarobotics.app.help.THIRD_VALUE"
android:value="true" />
</activity>
Launch Developer App via Voice command
In the original Android design, an App is launched by tapping its icon in the Launcher. On NUWA Robots, a "Voice command" can also launch an activity or broadcast an intent to a registered receiver.
*The developer can set up several "Voice Commands", separated by ",".
NOTICE: a "Voice Command" has to be matched 100%.
Ex: in your manifest, add the "intent-filter" of "com.nuwarobotics.api.action.VOICE_COMMAND" to an "activity" to identify the activity to be launched by the Robot Launcher when the Robot hears voice commands like [Play competition game,Play student competition game,I want play student game,student competition].
<activity
android:name=".YouAppClass"
android:exported="true"
android:label="App Name"
android:screenOrientation="landscape" >
<intent-filter>
<action android:name="com.nuwarobotics.api.action.VOICE_COMMAND" />
</intent-filter>
<meta-data
android:name="com.nuwarobotics.api.action.VOICE_COMMAND"
android:value="Play competition game,Play student competition game,I want play student game,student competition" />
</activity>
Ex: in your manifest, add the "intent-filter" of "com.nuwarobotics.api.action.VOICE_COMMAND" to a "receiver" to identify the receiver to be triggered by the Robot Launcher when the Robot hears voice commands like [I want to watch TV,I want to sleep,I want to turn on air conditioner].
<receiver android:name=".VoiceCommandListener" >
<intent-filter>
<action android:name="com.nuwarobotics.api.action.VOICE_COMMAND" />
</intent-filter>
<meta-data
android:name="com.nuwarobotics.api.action.VOICE_COMMAND"
android:value="I want to watch TV,I want to sleep,I want to turn on air conditioner" />
</receiver>
// in VoiceCommandListener.java
public class VoiceCommandListener extends BroadcastReceiver{
@Override
public void onReceive(Context context, Intent intent) {
Log.d(TAG, "action:" + intent.getAction());
if (intent.getAction().equals("com.nuwarobotics.api.action.VOICE_COMMAND")) {
String cmd = intent.getStringExtra("cmd");
Log.d(TAG, "user speak: " + cmd);
}
}
}
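Because matching is exact (100%), it can help to mirror the comma-separated meta-data value in code when debugging recognition issues. A sketch of the matching rule (the helper class is our own illustration, not part of the SDK):

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the matching rule: the meta-data value is a comma-separated
// list, and the spoken sentence must match one entry exactly.
class VoiceCommandMatcher {
    private final List<String> commands;

    VoiceCommandMatcher(String metaDataValue) {
        commands = Arrays.asList(metaDataValue.split(","));
    }

    // 100% match required: no partial or fuzzy matching.
    boolean matches(String spoken) {
        return commands.contains(spoken);
    }
}
```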
Disable Support Always Wakeup
Kebbi Air supports "Always wakeup" anywhere. (Kebbi and Danny only support wakeup on the Face.) The App can declare the setting at Application or Activity scope.
Declare it in AndroidManifest.xml:
<meta-data android:name="disableAlwaysWakeup" android:value="true" />
Application Scope Example
<application
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:theme="@style/AppTheme">
<meta-data android:name="disableAlwaysWakeup" android:value="true" />
</application>
Activity Scope Example
<activity
android:name=".YouAppClass"
android:exported="true"
android:label="App Name"
android:screenOrientation="landscape" >
<meta-data android:name="disableAlwaysWakeup" android:value="true" />
</activity>
How to call Nuwa UI to add face recognition
The App can call the Nuwa Face Recognition APP to add a face to the system. The developer can launch it via an Intent and receive the face id and name from onActivityResult.
private final int ACTIVITY_FACE_RECOGNITION = 1;
private void launchFaceRecogActivity() {
Intent intent = new Intent("com.nuwarobotics.action.FACE_REC");
intent.setPackage("com.nuwarobotics.app.facerecognition2");
intent.putExtra("EXTRA_3RD_REC_ONCE", true);
startActivityForResult(intent, ACTIVITY_FACE_RECOGNITION);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
Log.d(TAG, "onActivityResult, requestCode=" + requestCode + ", resultCode=" + resultCode);
if (resultCode > 0) {
switch (requestCode) {
case ACTIVITY_FACE_RECOGNITION:
mFaceID = data.getLongExtra("EXTRA_RESULT_FACEID", 0);
mName = data.getStringExtra("EXTRA_RESULT_NAME");
Log.d(TAG, "onActivityResult, faceid=" + mFaceID + ", nickname=" + mName);
lunchRoomListActivity(mName);
break;
}
} else {
Log.d(TAG, "unexpected exit");
}
}
Setup 3rd App as Launcher app to replace Nuwa Face Activity
Nuwa Face Activity is a Unity Activity that presents the Kebbi face. It is set as the default HOME app so that the HOME key always returns to the Nuwa Face Activity.
Developers can completely replace the Kebbi face with the following implementation.
Step 1 : Declare the 3rd-party App as a HOME app.
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.HOME" />
<category android:name="android.intent.category.DEFAULT" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
Step 2 : Setup 3rd App as default HOME App.
1. Open Setting
2. click "About Kebbi" 10 times to enable developer menu
3. click "Home Setting"(啟動器設定)
4. "Home app" (ホームアプリ) -> setup to your app
NOTICE : With this setting, "Nuwa Face Activity" will not always be shown after the HOME key is pressed. You have to call the Face Control API to show it.
Prevent HOME KEY to suspend device
Some business environments require that users cannot suspend the device by pressing the HOME key.
1. Open Setting
2. click "About Kebbi" 10 times to enable developer menu
3. "Home screen key for locking screen" (ホーム画面のキーロック画面)
4. Disable it to prevent going to sleep when the HOME key is pressed on the Nuwa Face Activity.
How to auto-launch a 3rd-party Activity at boot
A 3rd-party App can declare the following intent in AndroidManifest.xml.
Full Example:
<activity android:name=".CustomerAutoStartActivity">
<intent-filter>
<action android:name="com.nuwarobotics.feature.autolaunch" />
<category android:name="android.intent.category.DEFAULT" />
</intent-filter>
</activity>
Q & A
Q1: Why doesn't it work when the App calls "mRobot.motionPlay("001_J1_Good", true/false)"?
Ans:
[Kebbi Motion list] https://dss.nuwarobotics.com/documents/listMotionFile
Q2: Why App can not control LED?
Ans:
Q3: Why App GUI widget(Ex: Button) can not receive any touch event?
Ans:
Q4: Why App GUI widget(Ex: button) can not receive any touch event after playing a motion?
Ans:
Q5: Does local ASR and cloud ASR support English?
Ans:
Q6: Does local TTS support English?
Ans:
Q7: Why App can not receive Robot touch event?
Ans:
mRobot.registerRobotEventListener(new RobotEventListener() {
@Override
public void onTouchEvent(int type, int touch) {
// type: head: 1, chest: 2, right hand: 3, left hand: 4, left face: 5,right face: 6.
// touch: touched: 1, untouched: 0
}
});