Building a Video Chat App Using React Hooks and Agora

Ekaansh Arora
5 min read · Apr 8, 2021

With React 16.8, we now have access to hooks in functional components. I love this new way of writing React components. We’ll use the React wrapper library for the Agora Web NG SDK to make it easier than ever to build a video calling app in React.

The Agora Web SDK makes the task of building a real-time engagement application painless. You don’t have to worry about latency, scale, or compatibility. The SDK uses an event-driven approach to handle users and their videos. The React wrapper gives you methods to create hooks that make accessing the client object and tracks object simple. We’ll discuss this in detail below.

Creating an Account with Agora

Sign up for an account and log in to the dashboard.

The project management tab on the website

Navigate to the Project List tab under the Project Management tab, and create a project by clicking the blue Create button. (When prompted to use App ID + Certificate, select only App ID.) Retrieve the App ID, which will be used to authorize your requests while you’re developing the application.

Note: This guide does not implement token authentication, which is recommended for all RTE apps running in production environments. For more information about token-based authentication in the Agora platform, see this guide: https://docs.agora.io/en/Video/token?platform=All%20Platforms

Prerequisites

Initialize the Project

You can get the code for the example on GitHub, or you can create your own React project with any toolchain you like. Here’s how to create a project using create-react-app, which gives you a new React project (my-app) with TypeScript (optional). Open a terminal and execute:

npx create-react-app my-app --template typescript
cd my-app

Install the wrapper library using git, which installs the NG SDK as well:

npm install git://github.com/AgoraIO-Community/agora-rtc-react#v1.0.0

That’s it. You can now run npm start to start the server and see the preview.

Structure of the Project

This is what your project directory should look like:

.
├── node_modules
├── public
├── src
│   ├── App.tsx
│   ├── index.css
│   ├── index.tsx
│   └── ...
├── package.json
...

Writing the Code

App.tsx

You can import and use the wrapper alongside the SDK. The wrapper gives you access to methods that you can use to create hooks. These hooks can be used to access the client object or the tracks object in any part of your application. More on what these objects do in a bit.

We define the config for the client. We’re setting the mode as rtc for real-time communication. You can also use live for livestream mode. We’re using the vp8 codec, but you can also use h264. Copy and paste the App ID that you generated from the Agora Console. If you’re using tokens, you’ll have to set the token value here (you can obtain a temporary token from the project menu).
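
A minimal sketch of what this could look like at the top of App.tsx. The placeholder App ID and the token value are yours to fill in, and the sketch assumes the wrapper re-exports the SDK types (if your version doesn’t, import ClientConfig from agora-rtc-sdk-ng instead):

// App.tsx — client configuration (a sketch; fill in your own values)
import { ClientConfig } from "agora-rtc-react";

const config: ClientConfig = {
  mode: "rtc",  // use "live" for livestream mode
  codec: "vp8", // "h264" also works
};

const appId: string = "<your App ID here>";
const token: string | null = null; // set this if your project uses token authentication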

App Component

In the App component that gets rendered on page load, we have two subcomponents: <VideoCall> for the video call and <ChannelForm> for the user to input their channel name.
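
A sketch of how the App component might switch between the two, assuming inCall and channelName live in local state (the heading text and class name are just illustrative):

import React, { useState } from "react";

const App = () => {
  // inCall switches between the channel form and the call screen.
  const [inCall, setInCall] = useState(false);
  const [channelName, setChannelName] = useState("");
  return (
    <div>
      <h1 className="heading">Agora React Demo</h1>
      {inCall ? (
        <VideoCall setInCall={setInCall} channelName={channelName} />
      ) : (
        <ChannelForm setInCall={setInCall} setChannelName={setChannelName} />
      )}
    </div>
  );
};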

Getting Hooks from the Wrapper

We can use the createClient method with the config to get a hook, which we’re calling useClient. The useClient hook gives us the client object, which is equivalent to AgoraRTCClient from the NG SDK.

The createMicrophoneAndCameraTracks method gives us a hook that we’re calling useMicrophoneAndCameraTracks. This hook gives us access to a ready state variable and an array of tracks. The ready variable is false by default. Once the tracks are initialized and ready to use, the ready variable is set to true by the wrapper. The tracks object is null by default. Once the tracks are ready, it contains an array of [IMicrophoneAudioTrack, ICameraVideoTrack].

Note: We use the create methods outside the App component at the top of the file.
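
Using the config object defined earlier, the top of the file could contain something like:

import { createClient, createMicrophoneAndCameraTracks } from "agora-rtc-react";

// Call the create methods once, at module scope, so every component
// in the file shares the same client and the same local tracks.
const useClient = createClient(config);
const useMicrophoneAndCameraTracks = createMicrophoneAndCameraTracks();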

VideoCall Component

The VideoCall component takes in setInCall (used to start/end the call) and channelName (users on the same channel can communicate with each other) as props. Using the useState hook, we define the array users, which stores a list of remote users, and a start variable that is set to true when we’ve joined the call.

We call the useClient hook. It will create a client if none exists and return the same client each time throughout the application. We also call the useMicrophoneAndCameraTracks hook to get the ready variable and the tracks array.

Initializing the Client

In the VideoCall component, we’ll use the useEffect hook to initialize the client so that the function isn’t called every render cycle.

We define an init function where we attach event listeners for SDK methods. On the user-published event, we add the remote user to the users state array. If we get the user-unpublished event, we remove the user from the state array, as we do with the user-left event. We use the join method to join the RTC channel using the App ID, channel name, and token while setting the UID to null (the SDK assigns the UID). We also publish the local tracks using the publish method. Finally, we set the start variable to true. We call the init function once the tracks are ready.

We render two subcomponents: <Controls> shows us buttons to mute/unmute the audio/video and to leave the call. <Videos> takes in the users array and the track array to render the videos.
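
Putting the pieces of this section together, the VideoCall component could be sketched roughly like this, assuming the appId, token, useClient, and useMicrophoneAndCameraTracks defined at the top of the file (prop types and class names are illustrative):

import React, { useEffect, useState } from "react";
import { IAgoraRTCRemoteUser } from "agora-rtc-react";

const VideoCall = (props: {
  setInCall: React.Dispatch<React.SetStateAction<boolean>>;
  channelName: string;
}) => {
  const { setInCall, channelName } = props;
  const [users, setUsers] = useState<IAgoraRTCRemoteUser[]>([]);
  const [start, setStart] = useState(false);
  const client = useClient();
  const { ready, tracks } = useMicrophoneAndCameraTracks();

  useEffect(() => {
    const init = async (name: string) => {
      // Subscribe to remote users as they publish audio or video.
      client.on("user-published", async (user, mediaType) => {
        await client.subscribe(user, mediaType);
        if (mediaType === "video") {
          setUsers((prev) => [...prev, user]);
        }
        if (mediaType === "audio") {
          user.audioTrack?.play();
        }
      });
      // Drop remote users when they unpublish their video or leave.
      client.on("user-unpublished", (user, mediaType) => {
        if (mediaType === "audio") user.audioTrack?.stop();
        if (mediaType === "video") {
          setUsers((prev) => prev.filter((u) => u.uid !== user.uid));
        }
      });
      client.on("user-left", (user) => {
        setUsers((prev) => prev.filter((u) => u.uid !== user.uid));
      });

      // Join the channel (the SDK assigns a UID when null is passed),
      // then publish the local microphone and camera tracks.
      await client.join(appId, name, token, null);
      if (tracks) await client.publish([tracks[0], tracks[1]]);
      setStart(true);
    };

    if (ready && tracks) {
      init(channelName);
    }
  }, [channelName, client, ready, tracks]);

  return (
    <div className="App">
      {ready && tracks && (
        <Controls tracks={tracks} setStart={setStart} setInCall={setInCall} />
      )}
      {start && tracks && <Videos users={users} tracks={tracks} />}
    </div>
  );
};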

Videos Component

We use the <AgoraVideoPlayer> component to render the user videos. The component takes in the videoTrack as a prop. You can pass in other props, which get passed to the div containing the video tracks, such as style or className.
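
One way the Videos component could render the local and remote videos (the vid class and videos id are illustrative and line up with the CSS note below):

import React from "react";
import {
  AgoraVideoPlayer,
  IAgoraRTCRemoteUser,
  ICameraVideoTrack,
  IMicrophoneAudioTrack,
} from "agora-rtc-react";

const Videos = (props: {
  users: IAgoraRTCRemoteUser[];
  tracks: [IMicrophoneAudioTrack, ICameraVideoTrack];
}) => {
  const { users, tracks } = props;
  return (
    <div id="videos">
      {/* The local camera track sits at index 1 of the tracks array. */}
      <AgoraVideoPlayer className="vid" videoTrack={tracks[1]} />
      {/* One player per remote user that is currently publishing video. */}
      {users.map(
        (user) =>
          user.videoTrack && (
            <AgoraVideoPlayer className="vid" videoTrack={user.videoTrack} key={user.uid} />
          )
      )}
    </div>
  );
};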

Controls Component

We use the useClient hook to access the client object in the Controls component. We define a mute function that mutes/unmutes the audio or video using the setEnabled method on the corresponding track, and we store the mute state of the local audio and video in a state variable so we can change the button style. We also define a function to leave the channel. We use the leave method to exit the channel and the removeAllListeners method to remove the event listeners (avoiding memory leaks). We free up the hardware resources (camera and mic) using the close method on our tracks. Finally, we set the state variables start and inCall to false.

Note: The wrapper doesn’t do this cleanup automatically; it’s handled by the user, which is why we close the tracks and remove the listeners ourselves when leaving the channel.
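
Roughly, the Controls component could look like the following; the button markup is illustrative, and trackState is sketched here as a plain state object holding the audio/video flags:

import React, { useState } from "react";
import { ICameraVideoTrack, IMicrophoneAudioTrack } from "agora-rtc-react";

const Controls = (props: {
  tracks: [IMicrophoneAudioTrack, ICameraVideoTrack];
  setStart: React.Dispatch<React.SetStateAction<boolean>>;
  setInCall: React.Dispatch<React.SetStateAction<boolean>>;
}) => {
  const client = useClient(); // the hook created at the top of the file
  const { tracks, setStart, setInCall } = props;
  const [trackState, setTrackState] = useState({ video: true, audio: true });

  // Toggle the local microphone (tracks[0]) or camera (tracks[1]).
  const mute = async (type: "audio" | "video") => {
    if (type === "audio") {
      await tracks[0].setEnabled(!trackState.audio);
      setTrackState((ps) => ({ ...ps, audio: !ps.audio }));
    } else {
      await tracks[1].setEnabled(!trackState.video);
      setTrackState((ps) => ({ ...ps, video: !ps.video }));
    }
  };

  // Leave the channel, detach the event listeners, and release the hardware.
  const leaveChannel = async () => {
    await client.leave();
    client.removeAllListeners();
    tracks[0].close();
    tracks[1].close();
    setStart(false);
    setInCall(false);
  };

  return (
    <div className="controls">
      <p onClick={() => mute("audio")}>{trackState.audio ? "Mute Audio" : "Unmute Audio"}</p>
      <p onClick={() => mute("video")}>{trackState.video ? "Mute Video" : "Unmute Video"}</p>
      <p onClick={leaveChannel}>Leave</p>
    </div>
  );
};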

ChannelForm Component

We render a simple form to take in the channel name as input from the user before starting the call.
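
Something along these lines works, assuming the setInCall and setChannelName setters are passed down from App (markup and class names are illustrative):

import React from "react";

const ChannelForm = (props: {
  setInCall: React.Dispatch<React.SetStateAction<boolean>>;
  setChannelName: React.Dispatch<React.SetStateAction<string>>;
}) => {
  const { setInCall, setChannelName } = props;
  return (
    <form className="join">
      <input
        type="text"
        placeholder="Enter channel name"
        onChange={(e) => setChannelName(e.target.value)}
      />
      <button
        onClick={(e) => {
          // Prevent the default form submission and start the call.
          e.preventDefault();
          setInCall(true);
        }}
      >
        Join
      </button>
    </form>
  );
};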

index.css

We write some basic CSS styles to make our app look nice.

Note: The vid class used with <AgoraVideoPlayer> must have an explicit height. Otherwise, the div containing the video has zero height by default.
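
For example, something like this in index.css gives each video an explicit size (the exact values are up to you):

.vid {
  height: 95%;
  width: 95%;
  position: relative;
}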

Conclusion

That’s how we can use the React wrapper and hooks to make it easy for us to use the NG SDK in a React app without having to worry about creating a client or track and persisting it throughout the application. You can check out the NG SDK API reference and extend this app with any feature you want. You can use this basic video chat app and change a few lines of code for live audio/video streaming.


Ekaansh Arora

Developer Evangelist, Agora. Passionate about combining music and art with code and creating interactive experiences.