Issue sending & receiving streams between two clients #36

Open · alaaamady opened this issue Dec 13, 2022 · 1 comment
Labels: bug (Something isn't working)

@alaaamady

Describe the bug
I'm trying to build on the provided example app. So far I've implemented everything the same way as the example, and I can successfully connect to a room on the example website. I receive audio from the website, but I don't receive the video stream, and I can't send audio or video at all.

To Reproduce
Steps to reproduce the behavior:

  1. Add the following to index.js:
import { registerRootComponent } from "expo";
import { registerGlobals } from "livekit-react-native";

import App from "./App";
registerRootComponent(App);
registerGlobals();

  2. Render the following component in App.tsx:
import { Participant, Room, Track } from "livekit-client";
import {
  useRoom,
  useParticipant,
  AudioSession,
  VideoView,
} from "livekit-react-native";
import { useEffect, useState } from "react";
import { Text, ListRenderItem, StyleSheet, FlatList, View } from "react-native";
import { ParticipantView } from "./ParticipantView";
import { RoomControls } from "./RoomControls";
import type { TrackPublication } from "livekit-client";

const App = () => {
  // Create a room state
  const [, setIsConnected] = useState(false);
  const [room] = useState(
    () =>
      new Room({
        publishDefaults: { simulcast: false },
        adaptiveStream: true,
      })
  );

  // Get the participants from the room
  const { participants } = useRoom(room);
  const url = "[hard-coded-url]";
  const token =
    "[hard-coded-token";
  useEffect(() => {
    let connect = async () => {
      // Configure the audio session (optional in the example app):
      await AudioSession.configureAudio({
        android: {
          preferredOutputList: ["speaker"],
        },
        ios: {
          defaultOutput: "speaker",
        },
      });
      await AudioSession.startAudioSession();
      await room.connect(url, token, {});
      await room.localParticipant.setCameraEnabled(true);
      await room.localParticipant.setMicrophoneEnabled(true);
      await room.localParticipant.enableCameraAndMicrophone();
      console.log("connected to ", url);
      setIsConnected(true);
    };

    connect();
    return () => {
      room.disconnect();
      AudioSession.stopAudioSession();
    };
  }, [url, token, room]);
  // Setup views.
  const stageView = participants.length > 0 && (
    <ParticipantView participant={participants[0]} style={styles.stage} />
  );

  const renderParticipant: ListRenderItem<Participant> = ({ item }) => {
    return (
      <ParticipantView participant={item} style={styles.otherParticipantView} />
    );
  };

  const otherParticipantsView = participants.length > 0 && (
    <FlatList
      data={participants}
      renderItem={renderParticipant}
      keyExtractor={(item) => item.sid}
      horizontal={true}
      style={styles.otherParticipantsList}
    />
  );

  const { cameraPublication, microphonePublication } = useParticipant(
    room.localParticipant
  );

  return (
    <View style={styles.container}>
      {stageView}
      {otherParticipantsView}
      <RoomControls
        micEnabled={isTrackEnabled(microphonePublication)}
        setMicEnabled={(enabled: boolean) => {
          room.localParticipant.setMicrophoneEnabled(enabled);
        }}
        cameraEnabled={isTrackEnabled(cameraPublication)}
        setCameraEnabled={(enabled: boolean) => {
          room.localParticipant.setCameraEnabled(enabled);
        }}
        onDisconnectClick={() => {
          //   navigation.pop();
          console.log("disconnected");
        }}
      />
    </View>
  );
};

function isTrackEnabled(pub?: TrackPublication): boolean {
  return !(pub?.isMuted ?? true);
}
const styles = StyleSheet.create({
  container: {
    flex: 1,
    alignItems: "center",
    justifyContent: "center",
  },
  stage: {
    flex: 1,
    width: "100%",
  },
  otherParticipantsList: {
    width: "100%",
    height: 150,
    flexGrow: 0,
  },
  otherParticipantView: {
    width: 150,
    height: 150,
  },
});

export default App;

The components used here (ParticipantView, RoomControls) are mostly the same as in the example app; I've only removed the screen-sharing logic and the messages. A simplified sketch of the ParticipantView pattern is included below for reference.
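For reference, this is roughly the ParticipantView pattern from the example app (a simplified, untested sketch; the VideoView prop names are my best recollection and may not match the installed version exactly):

import * as React from "react";
import { View } from "react-native";
import type { StyleProp, ViewStyle } from "react-native";
import type { Participant } from "livekit-client";
import { useParticipant, VideoView } from "livekit-react-native";

// Simplified ParticipantView sketch: render the participant's camera track
// if it is available and not muted, otherwise an empty placeholder view.
export const ParticipantViewSketch = ({
  participant,
  style,
}: {
  participant: Participant;
  style?: StyleProp<ViewStyle>;
}) => {
  const { cameraPublication } = useParticipant(participant);
  const videoTrack = cameraPublication?.videoTrack;

  if (!videoTrack || cameraPublication?.isMuted) {
    return <View style={style} />;
  }

  // The `videoTrack` prop name is assumed from the example app.
  return <VideoView style={style} videoTrack={videoTrack} />;
};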
  3. Run the app using an Expo development build.
  4. It logs that it's connected; you can hear audio from the remote participant, but you can't see any video or send any audio.
  5. If I include
await room.localParticipant.enableCameraAndMicrophone();
in the useEffect (as shown in the code above), I get the following error:

Possible Unhandled Promise Rejection (id: 0):
Error: Not implemented.
getSettings@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:103733:24
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:120307:109
generatorResume@[native code]
asyncGeneratorStep@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21908:26
_next@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21927:29
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21932:14
tryCallTwo@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:26656:9
doResolve@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:26788:25
Promise@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:26675:14
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21924:25
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:120173:52
generatorResume@[native code]
asyncGeneratorStep@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21908:26
_next@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21927:29
tryCallOne@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:26648:16
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:26729:27
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:27687:26
_callTimer@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:27602:17
_callReactNativeMicrotasksPass@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:27635:17
callReactNativeMicrotasks@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:27799:44
__callReactNativeMicrotasks@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21006:46
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:20806:45
__guard@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:20986:15
flushedQueue@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:20805:21
flushedQueue@[native code]
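The "Not implemented" in the trace comes from a getSettings call, which I'd guess is react-native-webrtc's MediaStreamTrack.getSettings(). A minimal probe (untested sketch) to check whether getSettings() throws with the installed react-native-webrtc version, outside of LiveKit:

import { mediaDevices } from "react-native-webrtc";

// Grab a local audio+video stream directly from react-native-webrtc and
// call getSettings() on each track, to see whether it throws on its own.
export async function probeGetSettings() {
  const stream = await mediaDevices.getUserMedia({ audio: true, video: true });
  for (const track of stream.getTracks()) {
    try {
      console.log(track.kind, "settings:", track.getSettings());
    } catch (err) {
      console.log(track.kind, "getSettings() threw:", err);
    }
  }
  // Release the camera and microphone again.
  stream.getTracks().forEach((t) => t.stop());
}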

Expected behavior
Both clients should be able to send and receive video and audio streams.

Device Info (please complete the following information):

  • Device: iPhone X
  • OS: iOS 16.0
  • LiveKit Version: beta

Additional context
Normal peer-to-peer video calling using react-native-webrtc works fine, so the issue isn't with the native WebRTC setup.
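One way I could narrow this down further is to log LiveKit's publish/subscribe events after connecting, to check whether local tracks are ever published and remote tracks are ever subscribed (untested sketch using livekit-client's RoomEvent):

import { Room, RoomEvent } from "livekit-client";

// Attach listeners that log publish/subscribe activity, to separate
// "tracks never published/subscribed" from "tracks arrive but don't render".
export function attachDebugListeners(room: Room) {
  room
    .on(RoomEvent.LocalTrackPublished, (pub) => {
      console.log("local track published:", pub.kind, pub.trackSid);
    })
    .on(RoomEvent.TrackSubscribed, (_track, pub, participant) => {
      console.log("subscribed:", participant.identity, pub.kind, pub.trackSid);
    })
    .on(RoomEvent.TrackSubscriptionFailed, (trackSid, participant) => {
      console.log("subscription failed:", participant.identity, trackSid);
    });
}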

alaaamady added the bug label on Dec 13, 2022
@davidliu (Contributor)

What commit of livekit-react-native are you using, alongside the livekit-client and react-native-webrtc dependencies?
