Use this guide to quickly start interactive live audio streaming with the Agora Voice SDK for Android.
Before proceeding, ensure that your development environment meets the requirements of the Voice SDK.
For new projects, in Android Studio, create a Phone and Tablet Android project with an Empty Activity.
Integrate the Voice SDK into your project with Maven Central. For more integration methods, see Other approaches to integrate the SDK.
a. In /Gradle Scripts/build.gradle(Project: <projectname>), add the following lines to add the Maven Central dependency:
buildscript {
repositories {
...
mavenCentral()
}
...
}
allprojects {
repositories {
...
mavenCentral()
}
}
b. In /Gradle Scripts/build.gradle(Module: <projectname>.app), add the following lines to integrate the Agora Voice SDK into your Android project:
...
dependencies {
...
// For x.y.z, fill in a specific SDK version number. For example, 3.5.0 or 3.7.0.2.
// Get the latest version number through the release notes.
implementation 'io.agora.rtc:voice-sdk:x.y.z'
}
Add permissions for network and device access.
In /app/Manifests/AndroidManifest.xml, add the following permissions after </application>:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.BLUETOOTH" />
<!-- Add the following permission on devices running Android 12.0 or later -->
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
To prevent obfuscating the code in the Agora SDK, add the following line to /Gradle Scripts/proguard-rules.pro:
-keep class io.agora.**{*;}
This section introduces how to use the Agora Voice SDK to start interactive live audio streaming.
Create the user interface (UI) for the audio streaming in the layout file of your project. Skip to Import Classes if you already have a UI in your project.
<?xml version="1.0" encoding="UTF-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:keepScreenOn="true"
tools:context=".ui.LiveRoomActivity">
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent">
<TextView
android:id="@+id/room_name"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentTop="true"
android:layout_centerHorizontal="true"
android:layout_marginTop="6dp"
android:textColor="@color/dark_black"
android:textSize="16sp"
android:textStyle="bold" />
<io.agora.propeller.ui.AGLinearLayout
android:id="@+id/bottom_container"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_alignParentLeft="true"
android:layout_alignParentStart="true"
android:orientation="vertical">
<ImageView
android:id="@+id/bottom_action_end_call"
android:layout_width="54dp"
android:layout_height="54dp"
android:layout_gravity="center_horizontal"
android:onClick="onEndCallClicked"
android:scaleType="center"
android:src="@drawable/btn_endcall" />
<RelativeLayout
android:id="@+id/bottom_action_container"
android:layout_width="match_parent"
android:layout_height="54dp"
android:gravity="center_vertical"
android:orientation="horizontal">
<ImageView
android:id="@id/switch_broadcasting_id"
android:layout_width="54dp"
android:layout_height="match_parent"
android:layout_alignParentLeft="true"
android:layout_alignParentStart="true"
android:scaleType="center"
android:src="@drawable/btn_request_broadcast" />
<LinearLayout
android:layout_width="wrap_content"
android:layout_height="match_parent"
android:layout_centerInParent="true"
android:orientation="horizontal">
<ImageView
android:id="@id/switch_speaker_id"
android:layout_width="54dp"
android:layout_height="match_parent"
android:onClick="onSwitchSpeakerClicked"
android:scaleType="center"
android:src="@drawable/btn_speaker" />
<ImageView
android:id="@id/mute_local_speaker_id"
android:layout_width="54dp"
android:layout_height="match_parent"
android:onClick="onVoiceMuteClicked"
android:scaleType="center"
android:src="@drawable/btn_mute" />
</LinearLayout>
</RelativeLayout>
</io.agora.propeller.ui.AGLinearLayout>
<EditText
android:id="@+id/msg_list"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_above="@id/bottom_container"
android:layout_below="@id/room_name"
android:layout_marginBottom="8dp"
android:layout_marginTop="6dp"
android:enabled="true"
android:focusable="false"
android:gravity="start|top"
android:inputType="none"
android:scrollbars="vertical" />
</RelativeLayout>
</FrameLayout>
Import the following classes in the activity file of your project:
import io.agora.rtc.IRtcEngineEventHandler;
import io.agora.rtc.RtcEngine;
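The snippets in the following steps also use standard Android and Agora classes. For reference, here are the additional imports they rely on (this list is inferred from the code below, not part of the original guide; the Constants and ClientRoleOptions package paths follow the 3.x SDK):
// Standard Android classes used by the snippets below.
import android.Manifest;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import androidx.annotation.NonNull;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
// Agora classes used by the role-setting snippets.
import io.agora.rtc.Constants;
import io.agora.rtc.models.ClientRoleOptions;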
Call the checkSelfPermission method to access the microphone of the Android device when launching the activity.
private static final String LOG_TAG = "VoiceQuickstart"; // tag used by the Log calls below
private static final int PERMISSION_REQ_ID_RECORD_AUDIO = 22;
// Ask for Android device permissions at runtime.
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_voice_chat_view);
// Initialize RtcEngine and join the channel after getting the permission.
if (checkSelfPermission(Manifest.permission.RECORD_AUDIO, PERMISSION_REQ_ID_RECORD_AUDIO)) {
initAgoraEngineAndJoinChannel();
}
}
public boolean checkSelfPermission(String permission, int requestCode) {
Log.i(LOG_TAG, "checkSelfPermission " + permission + " " + requestCode);
if (ContextCompat.checkSelfPermission(this,
permission)
!= PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this,
new String[]{permission},
requestCode);
return false;
}
return true;
}
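If the permission has not been granted yet, the system shows a request dialog and reports the user's choice through the onRequestPermissionsResult callback. A minimal sketch of handling it, assuming the method names used in this guide (the denial behavior is illustrative):
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == PERMISSION_REQ_ID_RECORD_AUDIO) {
        if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            // Permission granted: initialize the engine and join the channel.
            initAgoraEngineAndJoinChannel();
        } else {
            // Permission denied: audio streaming cannot start.
            Log.w(LOG_TAG, "RECORD_AUDIO permission denied");
            finish();
        }
    }
}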
Create and initialize the RtcEngine object before calling any other Agora APIs. In the strings.xml file, replace agora_app_id with your App ID. Call the create method and pass in the App ID to initialize the RtcEngine object.
You can also listen for callback events, such as when the local user joins the channel. Do not implement UI operations in these callbacks.
private RtcEngine mRtcEngine;
private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
@Override
// Listen for the onJoinChannelSuccess callback.
// This callback occurs when the local user successfully joins the channel.
public void onJoinChannelSuccess(String channel, final int uid, int elapsed) {
runOnUiThread(new Runnable() {
@Override
public void run() {
Log.i("agora","Join channel success, uid: " + (uid & 0xFFFFFFFFL));
}
});
}
@Override
// Listen for the onUserOffline callback.
// This callback occurs when the host leaves the channel or drops offline.
public void onUserOffline(final int uid, int reason) {
runOnUiThread(new Runnable() {
@Override
public void run() {
Log.i("agora","User offline, uid: " + (uid & 0xFFFFFFFFL));
onRemoteUserLeft();
}
});
}
};
...
// Initialize the RtcEngine object.
private void initializeEngine() {
try {
mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
} catch (Exception e) {
Log.e(LOG_TAG, Log.getStackTraceString(e));
throw new RuntimeException("Fatal error: RtcEngine initialization failed.\n" + Log.getStackTraceString(e));
}
}
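The onUserOffline callback above calls an onRemoteUserLeft helper that this guide does not define. A placeholder sketch; the body is illustrative:
// Hypothetical helper: react in the UI when the remote host leaves.
private void onRemoteUserLeft() {
    // For example, clear the message list or show a "host left" notice.
}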
After initializing the RtcEngine object, call the setChannelProfile method to set the channel profile as LIVE_BROADCASTING.
One RtcEngine object uses one profile only. If you want to switch to another profile, release the current RtcEngine object with the destroy method and create a new one before calling the setChannelProfile method.
private void setChannelProfile() {
mRtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
}
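Per the note above, switching profiles requires destroying and re-creating the engine first. A minimal sketch; the helper name and the newProfile parameter are illustrative:
// Release the current engine, create a new one, then set the new profile.
private void switchChannelProfile(int newProfile) throws Exception {
    RtcEngine.destroy();
    mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
    mRtcEngine.setChannelProfile(newProfile);
}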
In a live-streaming channel, you need to set the role and level of a user. You may use the following steps to set the user role and level in your app:
1. Allow the end user to set the role as host or audience.
2. Call the setClientRole method and set the role and options parameters according to the user's choice:
- For a host, set role as CLIENT_ROLE_BROADCASTER and set options as null. The latency between two hosts is < 400 ms.
- For an audience member, set role as CLIENT_ROLE_AUDIENCE and set the audienceLatencyLevel parameter in options as AUDIENCE_LATENCY_LEVEL_LOW_LATENCY. The latency from the host's client to the audience's client is 1500 ms - 2000 ms.
Note the following:
- If you switch the latency level from AUDIENCE_LATENCY_LEVEL_LOW_LATENCY to AUDIENCE_LATENCY_LEVEL_ULTRA_LOW_LATENCY, you switch, in essence, from Interactive Live Streaming Standard to Interactive Live Streaming Premium, and the latency from the host's client to the audience's client changes to 400 ms - 800 ms.
- If you switch the role from CLIENT_ROLE_AUDIENCE to CLIENT_ROLE_BROADCASTER, you switch, in essence, from Interactive Live Streaming Standard to Interactive Live Streaming Premium, and the latency between two hosts is < 400 ms.
- When the audience member's latency level is AUDIENCE_LATENCY_LEVEL_LOW_LATENCY, the jitterBufferDelay property in RemoteAudioStats does not take effect.
Host:
mRtcEngine.setClientRole(IRtcEngineEventHandler.ClientRole.CLIENT_ROLE_BROADCASTER);
Audience:
ClientRoleOptions clientRoleOptions = new ClientRoleOptions();
clientRoleOptions.audienceLatencyLevel = isLowLatency ? Constants.AUDIENCE_LATENCY_LEVEL_ULTRA_LOW_LATENCY : Constants.AUDIENCE_LATENCY_LEVEL_LOW_LATENCY;
mRtcEngine.setClientRole(IRtcEngineEventHandler.ClientRole.CLIENT_ROLE_AUDIENCE, clientRoleOptions);
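If your UI exposes the user's choice as booleans, you can wrap the two calls above in one helper. A minimal sketch; the method name and parameters are illustrative, not part of the original sample:
private void setClientRole(boolean isHost, boolean isLowLatency) {
    if (isHost) {
        mRtcEngine.setClientRole(IRtcEngineEventHandler.ClientRole.CLIENT_ROLE_BROADCASTER);
    } else {
        ClientRoleOptions clientRoleOptions = new ClientRoleOptions();
        // As in the snippet above, isLowLatency selects the Premium ultra-low-latency level.
        clientRoleOptions.audienceLatencyLevel = isLowLatency ? Constants.AUDIENCE_LATENCY_LEVEL_ULTRA_LOW_LATENCY : Constants.AUDIENCE_LATENCY_LEVEL_LOW_LATENCY;
        mRtcEngine.setClientRole(IRtcEngineEventHandler.ClientRole.CLIENT_ROLE_AUDIENCE, clientRoleOptions);
    }
}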
After setting the user role, you can call the joinChannel method to join a channel. In this method, set the following parameters:
- token: Pass a token that identifies the role and privilege of the user. You can set it as one of the following values:
  - A token generated at your server or a temporary token generated in Agora Console. When joining a channel, ensure that the channel name and uid are the same as those you use to generate the token.
  - If your project does not enable the app certificate, set token as "".
- channelName: Specify the channel name that you want to join.
- uid: The ID of the local user, an integer that should be unique. If you set uid as 0, the SDK assigns a user ID for the local user and returns it in the onJoinChannelSuccess callback.
Once a user joins the channel, the user subscribes to the audio streams of all the other users in the channel by default, which incurs usage and affects billing. If you do not want to subscribe to a specified stream or all remote streams, call the mute methods accordingly.
For more details on the parameter settings, see joinChannel.
private String mRoomName;

private void joinChannel() {
    // Get the channel name passed from the previous activity.
    mRoomName = getIntent().getStringExtra("CName");
    // Join a channel with a token.
    mRtcEngine.joinChannel(YOUR_TOKEN, mRoomName, "Extra Optional Data", 0);
}
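The permission snippet earlier calls initAgoraEngineAndJoinChannel, which this guide does not define. An illustrative sketch of wiring this section's steps together, assuming the helper names used above (mIsHost and mIsLowLatency are hypothetical fields set by your UI):
private void initAgoraEngineAndJoinChannel() {
    initializeEngine();                     // 1. Create the RtcEngine object.
    setChannelProfile();                    // 2. Set the LIVE_BROADCASTING profile.
    setClientRole(mIsHost, mIsLowLatency);  // 3. Apply the user's role choice.
    joinChannel();                          // 4. Join the channel with a token.
}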
You can implement more advanced features and functionalities in your interactive live audio streaming app.
Call the muteLocalAudioStream method to stop or resume sending the local audio stream, which mutes or unmutes the local user.
public void onLocalAudioMuteClicked(View view) {
mMuted = !mMuted;
mRtcEngine.muteLocalAudioStream(mMuted);
}
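The layout above also wires an onSwitchSpeakerClicked handler. One way to implement it is with the SDK's setEnableSpeakerphone method; a minimal sketch (the mSpeakerOn field is illustrative):
// Toggle audio routing between the speakerphone and the earpiece.
private boolean mSpeakerOn = true;

public void onSwitchSpeakerClicked(View view) {
    mSpeakerOn = !mSpeakerOn;
    mRtcEngine.setEnableSpeakerphone(mSpeakerOn);
}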
Call the leaveChannel method to leave the current channel according to your scenario, for example, when the interactive live streaming ends, when you need to close the app, or when your app runs in the background.
@Override
protected void onDestroy() {
super.onDestroy();
if (!mCallEnd) {
leaveChannel();
}
RtcEngine.destroy();
}
private void leaveChannel() {
// Leave the current channel.
mRtcEngine.leaveChannel();
}
Run the project on your Android device. When the audio streaming starts, all audience members can hear the host in the app.
In addition to integrating the Agora Voice SDK for Android through Maven Central, you can also import the SDK into your project by manually copying the SDK files. Download the latest SDK package, extract it, and copy the following files or subfolders from its libs folder to the corresponding path of your project.
File or subfolder | Path of your project
---|---
agora-rtc-sdk.jar file | /app/libs/
arm64-v8a folder | /app/src/main/jniLibs/
armeabi-v7a folder | /app/src/main/jniLibs/
include folder | /app/src/main/jniLibs/
x86 folder | /app/src/main/jniLibs/
x86_64 folder | /app/src/main/jniLibs/
If your Android project uses the armeabi architecture, copy the files from the armeabi-v7a folder to the armeabi folder of your project. Contact support@agora.io if you encounter any incompatibility issue.
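After copying the files, make sure Gradle loads the local jar. A minimal sketch for /Gradle Scripts/build.gradle(Module: <projectname>.app); this wiring is a common convention, not from the original guide (the .so folders under /app/src/main/jniLibs/ are picked up by default):
dependencies {
    // Loads agora-rtc-sdk.jar copied to /app/libs/.
    implementation fileTree(dir: 'libs', include: ['*.jar'])
}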