The issue with video is that the frames aren't always in perfect sync, even between two units of the same camera model. If both are started at a given moment, one camera may be halfway into or out of a frame while the other is not. This isn't a total processing nightmare, but it adds complexity.
First, to keep this as simple as possible, you'll need the same model of camera, zoomed to exactly the same focal length. In practice that means only two options, all the way in or all the way out, unless you can control both cameras from an app that sends them the same focal-length adjustment. Then you'll need to synchronise the shooting, and again there are two options. You can use a timecode calculation and a clock to start them at the same moment, or exactly one frame apart, so they line up. Or you can synchronise them after the fact: by audio, if both cameras hear the same source, or by lighting values, if the footage is colour-corrected and you own a program that can synchronise on the lighting at the common edges. Either way, cut both clips down to the same exact beginning and end, then nest each out of the sync sequence into its own output sequence for export to a new video file. Both options work; the second works because the software extends frames at the ends so the clips line up when you cut them at the same frame of the sequence. Even with all that, you now have two video files and there's one more thing you need to worry about: THE SHOOTING ANGLE. Both cameras need to shoot along the same 180-degree plane, with about 1/3 of the frame overlapping.
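The audio-sync option above boils down to finding the frame offset where the two soundtracks line up best. Here's a minimal sketch of that idea in plain Python: it assumes you've already reduced each camera's audio to one loudness value per video frame (the function name and inputs are illustrative, not any editor's real API), then slides one envelope against the other to find the best-matching lag.

```python
# Sketch: find the frame offset between two recordings by cross-correlating
# their per-frame audio loudness. Assumes you have already extracted one
# energy value per video frame for each camera; names are illustrative.

def best_offset(env_a, env_b, max_lag):
    """Return the lag (in frames) of env_b relative to env_a that
    maximizes their correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = 0.0
        for i, v in enumerate(env_a):
            j = i + lag
            if 0 <= j < len(env_b):
                score += v * env_b[j]
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Example: a clap spike, with camera B started 3 frames after camera A.
clap = [0, 0, 0, 9, 5, 1, 0, 0, 0, 0]
delayed = [0, 0, 0, 0, 0, 0, 9, 5, 1, 0]
print(best_offset(clap, delayed, 5))  # → 3
```

Once you have the lag, you slip one clip by that many frames in your editor before trimming both to a common start and end.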
Here's where it can get interesting and long. I haven't yet seen the panorama function available on its own in AE. I have seen it in Photoshop, and I've even automated it using a Photoshop action for short sets of files. What you'll need to do is output each video to individual frames, then automate the stitching of each matched pair of frames for an actual stitch.
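The frames-out step is easy to script. This sketch just builds the ffmpeg command for dumping a clip to a numbered image sequence and pairs up the resulting filenames; the actual per-pair stitch is left to Photoshop's action (or whatever stitcher you use), since that part is the text's manual step. Filenames and directory names here are assumptions.

```python
# Sketch of automating the per-frame workflow: dump each video to a
# numbered PNG sequence with ffmpeg, then pair same-instant frames for
# stitching. The ffmpeg flags are standard; the stitch itself is done
# externally (e.g. a Photoshop action), so it is not shown here.

import os

def extract_cmd(video, out_dir):
    """Build an ffmpeg command that writes the clip's frames as PNGs."""
    return ["ffmpeg", "-i", video, os.path.join(out_dir, "frame_%06d.png")]

def paired_frames(n_frames):
    """Yield (left, right) filenames that belong to the same instant."""
    for i in range(1, n_frames + 1):
        yield ("left/frame_%06d.png" % i, "right/frame_%06d.png" % i)

print(extract_cmd("camA.mp4", "left"))
```

Run the extract command once per camera, feed each pair to the stitcher, then reassemble the stitched frames into a new video at the original frame rate.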
There's a second alternative not many people talk about. Set up a comp sized 2x − (1/3)x wide, where x is the pixel width of your video. Place each video to one side and line them up. From the clip that overlays on top of the other, remove between 1/6 and 1/4 of its width from the overlap area; this is where it would distort. Now all you need to do is apply any warp or straight-line adjustment, and they should line up nicely. Try to cut the top clip at an area without finely detailed objects, or at least without people most of the time. Where people actually cross the seam you may see distortion: go to those frames and selectively add the top overlay for the first few frames of their crossing, until they pass into the lower video.
This is the general process, I'm afraid. It's time-consuming, hard work, but worth it for meshing large frames together into crystal-clear extended video. Another rule of thumb:
Instead of overlapping the photographic 1/3 or so, you'll want to overlap at least two areas of minimal detail/action along the axis of combination (the horizontal for panoramic horizontal stretching, the vertical for stretching along the vertical). This can really cut into the size you can utilise. However, I've seen it done, and I'm not advocating going out and doing it, just describing the experience. Six to twelve 1080i cameras were used, all forcing opposing interlace: one camera would cross half into another for visual width, but be flipped upside down, the next right side up, and so on, in a multi-row fashion. The furthest outlying range of the video was single-field only, but at 3/4 of the single-cam size inward there was a perfect matchup of two fields, which was output from a single comp as full frames in JPEG before being wrapped into a comp of 60 fps progressive frame data for each row. When the rows were overlaid and stitched, they yielded between 6K and 8K progressive video once placed into a sequence in Premiere, which was later used for punch-ins at 2K. I was shocked as hell to see it. The guy called the apparatus his "Camera Wall": a tripod with a C-bracket screwed to it, another mono-bar attached to the top of the bracket, another C-bracket on that, and cameras placed at regular intervals of 4-6 feet across long bars on top of four tripods. The rendering alone took nearly a week, but it worked.
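For a sanity check on numbers like that, the effective strip width of such a rig is easy to estimate. This is rough arithmetic under the setup described (each camera crossing half into its neighbour); the formula and overlap fraction are my reading of the description, not measured specs.

```python
# Rough estimate for a "Camera Wall" row: n cameras of width w each,
# each overlapping its neighbour by a fraction of its width, give an
# effective strip width of w + (n - 1) * w * (1 - overlap).
# Half overlap (0.5) matches "one camera would cross half into another".

def wall_width(n, w, overlap=0.5):
    return w + (n - 1) * w * (1 - overlap)

print(wall_width(8, 1920))  # → 8640.0 (8 cameras at 1920 px, half overlap)
```

Eight 1920-wide cameras at half overlap land around 8.6K of horizontal coverage before seam trimming, which is consistent with the 6K-8K output described.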