Can someone help me figure out what it even is I'm trying to do? I'm a tech-savvy kind of person, and if someone just gives me the general idea/right keywords to search for, I can probably figure the rest out myself, but I'm caught in a real X/Y problem.
JUNK: Arch, KDE (X11), 3080 (proprietary drivers), OBS, Elgato HD60 X, 3440x1440 ultra widescreen
I just want to do some simple streaming to Twitch/Youtube and game recording.
The Elgato obviously doesn't support my ultrawide, so my original thought was to leave the UW monitor plugged in with DisplayPort (as it already is), plug in the Elgato with HDMI, and then switch the monitor input when I'm ready to stream. The UW stretches the 2560x1440 out, though. How do I configure the viewport to keep the proper aspect ratio and put black bars on the sides? Alternatively, can I configure the UW to 2560x1440 with black bars and simply mirror the display, or will I take a performance hit when streaming like that? And how do I change the X config on the fly? Is that something I'd want to write a script for?
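If it helps show what I'm picturing, here's roughly what I imagine the on-the-fly switch would look like on the proprietary NVIDIA driver (totally untested, and "DP-2" is just a guess at my output name):

```shell
# Sketch only -- check the real output name with `xrandr --listmonitors`.
# ViewPortIn is what X/games see; ViewPortOut places it on the panel.
# 3440 - 2560 = 880, so a +440 x-offset centers it with black bars.

# Switch to a centered 2560x1440 desktop for streaming:
nvidia-settings --assign CurrentMetaMode="DP-2: 3440x1440 { ViewPortIn=2560x1440, ViewPortOut=2560x1440+440+0 }"

# Back to native when done:
nvidia-settings --assign CurrentMetaMode="DP-2: 3440x1440"
```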
I inherited the Elgato from a friend who gave up on streaming and while I'm not entirely opposed to spending more money on potentially more appropriate gear ....... I'd really rather not.
Like I said, if someone can just explain to me what I should be doing and give me a swift kick in the ass towards the right direction, I can do the heavy work of putting all the pieces together, I'm not looking for a total solution. Thanks!
....... I thought that ... nevermind, this is why I'm here.
The Elgato has a USB cable coming out of it, and I thought that passing everything through it would allow the USB to feed/write the video stream without any other processing. I guess what I've really been after this whole time is more OBS tweaking.
I think this was a big missing piece for me.
For all my years in IT, I've never been an A/V nerd.
Unfortunately no. It captures the signal and turns it into something the computer can digest, but the signal isn't something that just proxies straight through to Twitch. OBS is always going to do some re-rendering.
A few tips:
If you open OBS settings, there is an "Output" section. You can change the output mode to "Advanced" and then select a "Video Encoder" ... this is where you would find NVENC (there might be a way to do it in the simple output mode too, but I don't have an NVIDIA GPU to confirm).
You'll most likely want to change the output resolution in the "Video" section of the settings down to 1280x720. Twitch limits your bandwidth anyway, and people tend to find that 1080p at low bandwidth doesn't look any better than 720p at the same bandwidth (fewer compression artifacts, because it doesn't have to compress as much, if at all).
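To put rough numbers on that, here's a quick bits-per-pixel comparison (the 6000 kbps figure is just an example near Twitch's usual ceiling):

```python
# Same bitrate spread over fewer pixels means each pixel gets more bits,
# i.e. less aggressive compression per frame.

def bits_per_pixel(bitrate_kbps: int, width: int, height: int, fps: int) -> float:
    """Average bits available per pixel per frame at a given bitrate."""
    return (bitrate_kbps * 1000) / (width * height * fps)

bitrate = 6000  # kbps, example value near Twitch's usual cap

print(f"1080p60: {bits_per_pixel(bitrate, 1920, 1080, 60):.3f} bits/pixel")
print(f" 720p60: {bits_per_pixel(bitrate, 1280, 720, 60):.3f} bits/pixel")
# 720p gets 2.25x the bits per pixel of 1080p at the same bitrate
```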
Twitch has an option for bandwidth tests (or at least it used to). This makes their servers accept the stream, but you don't actually go live on the site. You can use it to see how your computer handles streaming. On the main OBS dashboard, you'll see a
30.00 / 30.00 FPS
readout in the bottom-right corner (or whatever framerate you've selected). There's also a CPU meter down there. In the Docks menu there's a Stats dock as well. It will tell you how many frames are missed due to rendering or encoding lag. If you have 0 missed frames, then your PC is handling the encoding just fine. It will also list how many frames were dropped due to NETWORK. That would indicate a problem between you and Twitch/YouTube on the internet: your computer is rendering the frames just fine, but Twitch isn't receiving them.
Use the Stats dock to figure out where you are losing frames, then fix that (if it's rendering/encoding, it's NVENC or your CPU struggling; if it's network, it's your ISP struggling). And if you aren't losing frames, then you have nothing to worry about. The dock will also show you CPU and memory usage, but realistically, if you're using a 3080 with NVENC, those will probably be very low.
You are correct that the Elgato does video encoding, and that using your GPU puts a little bit of extra load on it. But it's negligible, since the video encoder is a separate part of the chip. Maybe you'll lose a percentage of FPS due to power usage and bandwidth, but honestly the same is probably true for the CPU load caused by USB bandwidth.
Even with the Elgato doing "video encoding", how does it get to Twitch/YouTube? It doesn't do THAT kind of encoding. It encodes the HDMI capture into a local format that is basically a webcam stream. It has to be broadcast from OBS, and even if you use the Elgato as a video source, OBS is going to re-encode it into what it wants to broadcast. There isn't really any getting around the video-encoding cost of OBS, unless you have a device that streams to the internet directly from the capture card (it doesn't seem like Elgato makes one; someone else might, but that's not really what capture cards are for).
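For what it's worth, you can see the same pipeline without OBS: the card shows up as a V4L2 device, and something still has to encode it and push RTMP. A rough sketch (device path and stream key are placeholders):

```shell
# The capture card appears as a V4L2 device (path is a guess -- check
# with `v4l2-ctl --list-devices`). Even with the card's onboard
# "encoding", an encoder still has to produce the stream Twitch ingests.
ffmpeg -f v4l2 -framerate 60 -video_size 1920x1080 -i /dev/video0 \
       -c:v h264_nvenc -b:v 6000k \
       -f flv rtmp://live.twitch.tv/app/YOUR_STREAM_KEY
```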
Ah, I did some more research, and what I said only applies to the older Elgato devices. They did use H.264 as the format over USB, and you could use that directly without re-encoding. But they moved to a custom format due to the delay and decoding overhead. And of course you'd want stream overlays and such, which also require re-encoding.