Use this guide to quickly start interactive live audio streaming with the Agora Voice SDK for Android.
The following figure shows the workflow for adding Interactive Live Streaming Premium functionality to your app.
As shown in the figure, the workflow is as follows:
Set the client role
Each user in an Interactive Live Streaming Premium channel is either a host or an audience member. Hosts publish streams to the channel, and audience members subscribe to them.
Retrieve a token
A token is the credential that authenticates a user when your app client joins a channel. In a test or production environment, your app client retrieves tokens from a server in your security infrastructure.
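As an illustration of the retrieval step, suppose your token server responds with a JSON body such as `{"rtcToken":"006abc..."}`. Both the field name and the response shape are assumptions about your own server, not part of the Agora API; the client only needs to extract the token string before passing it to joinChannel. A minimal, dependency-free sketch:

```java
public class TokenResponseDemo {
    // Extract the token value from a server response such as {"rtcToken":"006abc"}.
    // The "rtcToken" field name is an assumption; match it to your own
    // token server's response format.
    public static String extractToken(String json) {
        String key = "\"rtcToken\":\"";
        int start = json.indexOf(key);
        if (start < 0) return null;          // field not found
        start += key.length();
        int end = json.indexOf('"', start);  // closing quote of the value
        return end < 0 ? null : json.substring(start, end);
    }

    public static void main(String[] args) {
        System.out.println(extractToken("{\"rtcToken\":\"006abc\"}")); // prints 006abc
    }
}
```

In production you would fetch the body over HTTPS and use a proper JSON parser; this helper only shows where the token fits into the flow.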
Join a channel
Call joinChannel to create and join a channel. App clients that pass the same channel name join the same channel.
Publish and subscribe to audio and video in the channel
After joining a channel, app clients with the role of the host can publish audio and video. For an audience member to send audio and video, call the setClientRole method to switch the client role.
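The steps above can be sketched as one condensed call sequence. This is a minimal sketch only: "YOUR_APP_ID", "YOUR_TOKEN", and the channel name "demoChannel" are placeholders you must replace, the empty event handler omits the callbacks a real app would implement, and exception handling is left out.

```java
import io.agora.rtc.Constants;
import io.agora.rtc.IRtcEngineEventHandler;
import io.agora.rtc.RtcEngine;

// Condensed workflow: create the engine, set the profile and role, join.
RtcEngine engine = RtcEngine.create(context, "YOUR_APP_ID", new IRtcEngineEventHandler() {});
engine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING); // live streaming profile
engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);               // host; audience uses CLIENT_ROLE_AUDIENCE
engine.joinChannel("YOUR_TOKEN", "demoChannel", "", 0);                // uid 0: the SDK assigns one
```

The rest of this guide walks through each of these calls in detail.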
For an app client to join a channel, you need the App ID, a user ID, the channel name, and a token.
Before proceeding, ensure that your development environment meets the following requirements:
For new projects, in Android Studio, create a Phone and Tablet Android project with an Empty Activity.
Integrate the Voice SDK into your project with Maven Central. For more integration methods, see Other approaches to integrate the SDK.
a. In /Gradle Scripts/build.gradle(Project: <projectname>), add the following lines to add the Maven Central dependency:
buildscript {
repositories {
...
mavenCentral()
}
...
}
allprojects {
repositories {
...
mavenCentral()
}
}
b. In /Gradle Scripts/build.gradle(Module: <projectname>.app), add the following lines to integrate the Agora Voice SDK into your Android project:
...
dependencies {
...
// For x.y.z, fill in a specific SDK version number. For example, 3.5.0 or 3.7.0.2.
// Get the latest version number through the release notes.
implementation 'io.agora.rtc:voice-sdk:x.y.z'
}
Add permissions for network and device access.
In /app/Manifests/AndroidManifest.xml, add the following permissions after </application>:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.BLUETOOTH" />
<!-- Add the following permission on devices running Android 12.0 or later -->
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
To prevent obfuscation of the Agora SDK code, add the following line to /Gradle Scripts/proguard-rules.pro:
-keep class io.agora.**{*;}
This section introduces how to use the Agora Voice SDK to start the interactive live audio streaming. The following figure shows the API call sequence of the interactive live audio streaming.
Create the user interface (UI) for the audio streaming in the layout file of your project. Skip to Import Classes if you already have a UI in your project.
You can also refer to the xml files under the layout path in the OpenLive-Voice-Only-Android demo project.
<?xml version="1.0" encoding="UTF-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:keepScreenOn="true"
tools:context=".ui.LiveRoomActivity">
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent">
<TextView
android:id="@+id/room_name"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentTop="true"
android:layout_centerHorizontal="true"
android:layout_marginTop="6dp"
android:textColor="@color/dark_black"
android:textSize="16sp"
android:textStyle="bold" />
<io.agora.propeller.ui.AGLinearLayout
android:id="@+id/bottom_container"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_alignParentLeft="true"
android:layout_alignParentStart="true"
android:orientation="vertical">
<ImageView
android:id="@+id/bottom_action_end_call"
android:layout_width="54dp"
android:layout_height="54dp"
android:layout_gravity="center_horizontal"
android:onClick="onEndCallClicked"
android:scaleType="center"
android:src="@drawable/btn_endcall" />
<RelativeLayout
android:id="@+id/bottom_action_container"
android:layout_width="match_parent"
android:layout_height="54dp"
android:gravity="center_vertical"
android:orientation="horizontal">
<ImageView
android:id="@id/switch_broadcasting_id"
android:layout_width="54dp"
android:layout_height="match_parent"
android:layout_alignParentLeft="true"
android:layout_alignParentStart="true"
android:scaleType="center"
android:src="@drawable/btn_request_broadcast" />
<LinearLayout
android:layout_width="wrap_content"
android:layout_height="match_parent"
android:layout_centerInParent="true"
android:orientation="horizontal">
<ImageView
android:id="@id/switch_speaker_id"
android:layout_width="54dp"
android:layout_height="match_parent"
android:onClick="onSwitchSpeakerClicked"
android:scaleType="center"
android:src="@drawable/btn_speaker" />
<ImageView
android:id="@id/mute_local_speaker_id"
android:layout_width="54dp"
android:layout_height="match_parent"
android:onClick="onVoiceMuteClicked"
android:scaleType="center"
android:src="@drawable/btn_mute" />
</LinearLayout>
</RelativeLayout>
</io.agora.propeller.ui.AGLinearLayout>
<EditText
android:id="@+id/msg_list"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_above="@id/bottom_container"
android:layout_below="@id/room_name"
android:layout_marginBottom="8dp"
android:layout_marginTop="6dp"
android:enabled="true"
android:focusable="false"
android:gravity="start|top"
android:inputType="none"
android:scrollbars="vertical" />
</RelativeLayout>
</FrameLayout>
Import the following classes in the activity file of your project:
import io.agora.rtc.IRtcEngineEventHandler;
import io.agora.rtc.RtcEngine;
Call the checkSelfPermission method to check for, and request, access to the microphone of the Android device when launching the activity.
private static final int PERMISSION_REQ_ID_RECORD_AUDIO = 22;
// Ask for Android device permissions at runtime.
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_voice_chat_view);
// Initialize RtcEngine and join the channel after getting the permission.
if (checkSelfPermission(Manifest.permission.RECORD_AUDIO, PERMISSION_REQ_ID_RECORD_AUDIO)) {
initAgoraEngineAndJoinChannel();
}
}
private void initAgoraEngineAndJoinChannel() {
initializeAgoraEngine();
joinChannel();
}
public boolean checkSelfPermission(String permission, int requestCode) {
Log.i(LOG_TAG, "checkSelfPermission " + permission + " " + requestCode);
if (ContextCompat.checkSelfPermission(this,
permission)
!= PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this,
new String[]{permission},
requestCode);
return false;
}
return true;
}
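The snippet above requests the RECORD_AUDIO permission but does not show handling the user's decision. A minimal sketch of the companion onRequestPermissionsResult callback, reusing the names from the snippet above (initAgoraEngineAndJoinChannel, PERMISSION_REQ_ID_RECORD_AUDIO, LOG_TAG):

```java
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == PERMISSION_REQ_ID_RECORD_AUDIO) {
        if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            // Permission granted: initialize the engine and join the channel.
            initAgoraEngineAndJoinChannel();
        } else {
            // Permission denied: the app cannot capture audio.
            Log.w(LOG_TAG, "RECORD_AUDIO permission denied");
        }
    }
}
```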
Create and initialize the RtcEngine object before calling any other Agora APIs. In the string.xml file, replace agora_app_id with your App ID. Call the create method and pass in the App ID to initialize the RtcEngine object.
You can also listen for callback events, such as when the local user joins the channel. Do not implement UI operations in these callbacks.
private RtcEngine mRtcEngine;
private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
@Override
// Listen for the onJoinChannelSuccess callback.
// This callback occurs when the local user successfully joins the channel.
public void onJoinChannelSuccess(String channel, final int uid, int elapsed) {
runOnUiThread(new Runnable() {
@Override
public void run() {
Log.i("agora","Join channel success, uid: " + (uid & 0xFFFFFFFFL));
}
});
}
@Override
// Listen for the onUserOffline callback.
// This callback occurs when the host leaves the channel or drops offline.
public void onUserOffline(final int uid, int reason) {
runOnUiThread(new Runnable() {
@Override
public void run() {
Log.i("agora","User offline, uid: " + (uid & 0xFFFFFFFFL));
onRemoteUserLeft();
}
});
}
};
...
// Initialize the RtcEngine object.
private void initializeAgoraEngine() {
try {
mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
} catch (Exception e) {
Log.e(TAG, Log.getStackTraceString(e));
throw new RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e));
}
}
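The callbacks above log uid as (uid & 0xFFFFFFFFL). The SDK delivers the user ID as a signed 32-bit int, while Agora user IDs are unsigned 32-bit values; masking with 0xFFFFFFFFL widens the value to a long and restores the unsigned interpretation. A standalone illustration:

```java
public class UidDemo {
    // Convert a signed 32-bit uid to its unsigned numeric value for display.
    public static long toUnsignedUid(int uid) {
        return uid & 0xFFFFFFFFL;
    }

    public static void main(String[] args) {
        System.out.println(toUnsignedUid(-1));        // prints 4294967295
        System.out.println(toUnsignedUid(123456789)); // prints 123456789
    }
}
```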
After initializing the RtcEngine object, call the setChannelProfile method to set the channel profile as LIVE_BROADCASTING.
One RtcEngine object uses one profile only. If you want to switch to another profile, release the current RtcEngine object with the destroy method and create a new one before calling the setChannelProfile method.
private void setChannelProfile() {
mRtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
}
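Because one RtcEngine object uses one profile only, switching profiles means destroying and recreating the engine, as described above. A sketch of what that might look like (the target profile CHANNEL_PROFILE_COMMUNICATION is just an example; exception handling is omitted):

```java
// Release the current engine; mRtcEngine must not be used after this call.
RtcEngine.destroy();
// Create a fresh engine and set the new profile on it.
mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
mRtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_COMMUNICATION);
```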
A live-streaming channel has two user roles: BROADCASTER and AUDIENCE; the default role is AUDIENCE. After setting the channel profile to LIVE_BROADCASTING, your app may use the following steps to set the client role:
a. Allow the user to choose a role: BROADCASTER or AUDIENCE.
b. Call the setClientRole method and pass in the role chosen by the user.
Note that in live streaming, only the host can be heard. If you want to switch the user role after joining the channel, call the setClientRole method again.
public void onClickJoin(View view) {
// Show a dialog box to choose a user role.
AlertDialog.Builder builder = new AlertDialog.Builder(this);
builder.setMessage(R.string.msg_choose_role);
builder.setNegativeButton(R.string.label_audience, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
MainActivity.this.forwardToLiveRoom(Constants.CLIENT_ROLE_AUDIENCE);
}
});
builder.setPositiveButton(R.string.label_broadcaster, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
MainActivity.this.forwardToLiveRoom(Constants.CLIENT_ROLE_BROADCASTER);
}
});
AlertDialog dialog = builder.create();
dialog.show();
}
// Get the user role and channel name specified by the user.
// The channel name is used when joining the channel.
public void forwardToLiveRoom(int cRole) {
final EditText v_room = (EditText) findViewById(R.id.room_name);
String room = v_room.getText().toString();
Intent i = new Intent(MainActivity.this, LiveRoomActivity.class);
i.putExtra("CRole", cRole);
i.putExtra("CName", room);
startActivity(i);
}
// Pass in the role set by the user.
private int mRole;
...
mRole = getIntent().getIntExtra("CRole", 0);
private void setClientRole() {
mRtcEngine.setClientRole(mRole);
}
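If an audience member later requests to broadcast (for example via the request-broadcast button in the layout above), you can call setClientRole again after joining the channel. A sketch using the fields from the snippets above:

```java
// Toggle the local user between host and audience after joining the channel.
private void toggleRole() {
    mRole = (mRole == Constants.CLIENT_ROLE_BROADCASTER)
            ? Constants.CLIENT_ROLE_AUDIENCE
            : Constants.CLIENT_ROLE_BROADCASTER;
    mRtcEngine.setClientRole(mRole);
}
```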
After setting the user role, you can call the joinChannel method to join a channel. In this method, set the following parameters:
token: Pass a token that identifies the role and privilege of the user. In a test environment you can use a temporary token; in production, use a token generated at your server. When joining a channel, ensure that the channel name and uid are the same as those you use to generate the token. If your project does not enable token authentication, set token as "".
channelName: Specify the channel name that you want to join.
uid: The ID of the local user, an integer that should be unique in the channel. If you set uid as 0, the SDK assigns a user ID for the local user and returns it in the onJoinChannelSuccess callback.
By default, users in the channel subscribe to the audio streams published by the hosts; to stop publishing or subscribing, call the mute methods accordingly.
For more details on the parameter settings, see joinChannel.
private String mRoomName;
...
private void joinChannel() {
    mRoomName = getIntent().getStringExtra("CName");
    // Join a channel with a token.
    mRtcEngine.joinChannel(YOUR_TOKEN, mRoomName, "Extra Optional Data", 0);
}
Call the leaveChannel method to leave the current channel according to your scenario, for example, when the streaming ends, when you need to close the app, or when your app runs in the background.
@Override
protected void onDestroy() {
super.onDestroy();
if (!mCallEnd) {
leaveChannel();
}
RtcEngine.destroy();
}
private void leaveChannel() {
// Leave the current channel.
mRtcEngine.leaveChannel();
}
You can find the complete code logic in the OpenLive-Voice-Only-Android demo project.
Run the project on your Android device. When the audio streaming starts, all the audience can hear the host in the app.
In addition to integrating the Agora Voice SDK for Android through Maven Central, you can also import the SDK into your project by manually copying the SDK files.
File or subfolder | Path of your project |
---|---|
agora-rtc-sdk.jar file | /app/libs/ |
arm64-v8a folder | /app/src/main/jniLibs/ |
armeabi-v7a folder | /app/src/main/jniLibs/ |
include folder | /app/src/main/jniLibs/ |
x86 folder | /app/src/main/jniLibs/ |
x86_64 folder | /app/src/main/jniLibs/ |
If your project uses the armeabi architecture, copy files from the armeabi-v7a folder to the armeabi folder of your project. Contact support@agora.io if you encounter any incompatibility issues.