Aaron Star
Forum Replies Created
-
See if this app runs.
https://wiki.luxcorerender.org/LuxMark_v2
If it does, and it uses the GPU+CPU to render, then you will need to contact Support to determine why Vegas is not seeing the GPU.
-
Questions:
What does MediaInfo (it's an app) report for the media on the timeline, and what is the source that generated it?
Can you post a snapshot of your project settings?
Is the 5800-series GPU a 5850 or a 5870? If you do not know, something like GPU-Z or Speccy can help with this.
Are there any other additional cards installed in the PC?
Is your dynamic RAM preview setting under Preferences set to the default?
You should really do some research on how changing the Preview selector impacts your playback. You should do most editing in Preview (Auto) or Preview (Half). The difference between Good and Best is in the scalers they apply; Best requires the PC to do all of the math for every frame (even if the preview window is scaled below HD). Stable playback is more desirable than pristine footage when editing. Rest assured that once rendered, the quality will be there. You can switch to Best (Full) to get a full representation when judging focus or graphics.
- Preview Auto will adjust to the preview window size and render what is needed.
- Best (Full) asks that every frame be rendered at full resolution no matter what the preview window size is, while using the most complex, most mathematically demanding scaler.
60p HD is twice as hard to play back as 30p; people do not always appreciate this. The project media and project settings all need to be in alignment, and the monitor needs to refresh at 60 Hz or higher. If not, Vegas will attempt to compensate, which generally means dropped frames (poor playback).
Memory: most people only look at used vs. free memory. Windows 8-10 have a more modern memory architecture and can use excess RAM as a file cache. Caching speeds up access to the most frequently used files. Which files in your Vegas project/timeline are used the most, and how big are they?
Here are some other things to try, but they are complex:
1 - Disable Windows 10 memory compression if you have 16GB or more. This will eliminate some overhead.
https://superuser.com/questions/1000485/how-to-disable-windows-10-memory-compression
2 - Verify your GPU is operating at 16x interface speed and is not sharing bandwidth with another add-on board.
3 - Verify your memory bandwidth is operating at optimal speed by executing “WinSAT mem” from an admin command prompt (see the sketch just below).
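If you want to script that last check, here is a minimal sketch in Python (Python and the "MB/s" filter are just my choices here, not part of the original tip). Run it from an elevated prompt, since winsat requires admin rights:

```python
# run_winsat_mem.py - quick memory-bandwidth check on Windows.
# Run from an elevated (admin) prompt; winsat will refuse to run otherwise.
import subprocess

result = subprocess.run(
    ["winsat", "mem"],      # Windows System Assessment Tool, memory throughput test
    capture_output=True,
    text=True,
)

print(result.stdout)

# Pull out just the throughput lines for a quick glance.
# The exact output wording can vary by Windows version.
for line in result.stdout.splitlines():
    if "MB/s" in line:
        print("Bandwidth line:", line.strip())
```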
Beyond this, you may need a bigger horse to carry your workload.
-
This probably depends on how well your current config is running.
Memory tips are:
Run the max frequency your system can run in a stable fashion. (i7-3820 – DDR3 1066/1333/1600 – some boards support higher speeds via overclocking, which tends to mean instability. Test, test, test.)
Single-rank memory is better.
Most desktop boards will not operate in dual-channel mode with all slots filled, especially if any DIMMs are of different types. Run WinSAT mem and verify your memory bandwidth indicates dual-channel operation.
Windows will use extra memory as a file cache. You can read data from disk at a couple hundred MB/s, or pull it from memory at around 50GB/s with far lower latency (see the sketch at the end of this post).
Memory bandwidth is the most noticeable stat, with system stability being king. You should be able to run MemTest through 3-10 complete passes with ZERO errors.
I think if you follow the first two suggestions, you will quickly see the limits of your current architecture.
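To see the Windows file cache in action, here is a rough Python sketch (the path is just a placeholder, use any large file on your system): the first read comes from disk, the second read of the same file is usually served from RAM and is dramatically faster.

```python
# cache_demo.py - rough illustration of the Windows file cache.
import time

PATH = r"D:\media\sample_clip.mxf"  # hypothetical test clip; point this at any large file

def timed_read(path):
    start = time.perf_counter()
    with open(path, "rb") as f:
        size = len(f.read())        # reads the whole file into memory, fine for a test clip
    elapsed = time.perf_counter() - start
    print(f"Read {size / 1e6:.0f} MB in {elapsed:.2f} s "
          f"({size / 1e6 / elapsed:.0f} MB/s)")

timed_read(PATH)   # cold read - limited by the disk
timed_read(PATH)   # warm read - served mostly from the Windows file cache
```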
-
Have you tried converting a sample of your source media to another format, like MXF AVC or another format compatible with the resolution?
Then test the converted media in a clean project.
-
Not exactly sure what parameters Vegas checks for, but starting with Vegas 10 the 5770 was the baseline supported GPU. Comparing the specs against the 5770, the NVIDIA 1060M should rank well in performance.
You might want to verify that Luxmark runs correctly in GPU+CPU mode.
If Luxmark does not see the GPU as an available OpenCL compute unit, then you might have to resolve driver problems.
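If you want to check what the drivers are actually exposing to OpenCL without installing LuxMark, here is a minimal sketch using the third-party pyopencl module (pyopencl is my suggestion, not something from this thread; you would need to pip install it first):

```python
# list_opencl_devices.py - enumerate the OpenCL platforms and devices the drivers expose.
# If the GPU is missing from this list, that points at a driver problem, per the note above.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name}")
    for device in platform.get_devices():
        dtype = cl.device_type.to_string(device.type)
        print(f"  {dtype}: {device.name} "
              f"({device.global_mem_size // (1024 ** 2)} MB global memory)")
```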
-
Can you post MediaInfo for your camera-original footage? Along with details on the project settings in Vegas and the render output?
-
As someone said earlier in the thread, it is hard to offer suggestions without details on the specific hardware, Vegas version, project settings, media format info, effects being used, etc.
For cuts-only editing the CPU is king, with a certain amount of assist from the GPU. For editing that contains computation-heavy effects like blurs, the GPU will show a performance improvement.
You could try converting your source media to another format, then test renders using that format to see whether one format is better optimized than another. For example, convert XAVC-S to an intra-frame format, or XAVC to ProRes, or an H.264 camera-phone original to one of the previously mentioned formats.
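ffmpeg is not mentioned in this thread, but it is one free way to do that test conversion. A rough sketch, assuming ffmpeg is installed and on the PATH, with placeholder file names:

```python
# convert_sample.py - transcode a short sample clip to ProRes for a test project.
import subprocess

SRC = "camera_original.mp4"      # e.g. an XAVC-S or phone H.264 clip (placeholder name)
DST = "test_sample_prores.mov"

subprocess.run(
    [
        "ffmpeg",
        "-t", "30",              # only convert the first 30 seconds for the test
        "-i", SRC,
        "-c:v", "prores_ks",     # ProRes encoder; profile 3 = ProRes 422 HQ
        "-profile:v", "3",
        "-c:a", "pcm_s16le",     # uncompressed PCM audio, standard for ProRes wrappers
        DST,
    ],
    check=True,
)
```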
-
Maybe look at Intel, Asus, or Gigabyte products? Not really sure; the issue apparently was discovered a year and a half ago, and they are only implicating Supermicro. It could be that we just do not know about the other manufacturers yet, since everything is made in China these days.
If you start seeing other large manufacturers recalling hardware/PCs en masse, that is a likely indicator.
If you hear of no other recalls or alerts, then it was probably limited to Supermicro.
-
You might want to avoid the Supermicro motherboard and choose another.
-
Here are some of the issues I see with that config:
16x DIMMs – the CPUs only have 12 memory channels between them. You want to find out how that board handles the memory inserted into the system; that will determine the number of DIMMs you need.
Dual CPUs might seem like the ultimate, but for Vegas you may want to play around with limiting your instance to a single CPU with processor affinity (see the sketch after these notes). I am pretty sure the Vegas developers do not test Vegas on such high-end hardware. When you get into dual CPUs, you are looking at running software that is application-specific and coded to be optimized for that hardware. Windows will give Vegas the threads and resources, but the application code may not be optimized to handle them correctly.
Multiple GPUs – Yeah, I guess, if you have that kind of money to invest and another application that needs them. Resolve will likely utilize multiple GPUs; I am not so sure the Vegas code is up to speed on utilizing multiple GPUs. If you are planning on running, say, three 4K monitors at high color depth, then multiple GPUs would pay off. If you have a 3D rendering workload that can utilize multiple GPUs, then yes as well.
Storage – If you have Xeon-level cash to toss at this problem, you should really do some research on the differences between SATA, SAS, and NVMe. With such a high-end configuration, you would want to follow my workflow idea of using separate archive and working storage. Working storage would be something like an Intel 750 SSD, with the OS on a 970 Pro M.2. Then back up your system regularly to LTO-8, or to multiple offline external HDDs formatted ReFS. Multiple external HDDs, because those suckers come up dead or bit-corrupted over time.
RAID is not dead, but in an SSD world, backup is better. NVMe is faster than RAID on SATA, and you want the lowest-latency storage possible to keep up with the memory bandwidth across the rest of the system.
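On the single-CPU affinity idea above, here is a minimal Python sketch using the third-party psutil module (psutil, the process name match, and the core list are all my assumptions; adjust the cores to whatever maps to CPU 0 on that board):

```python
# pin_vegas.py - restrict a running Vegas process to the cores of one CPU socket.
# Requires: pip install psutil. Run with sufficient privileges to modify the process.
import psutil

CPU0_CORES = list(range(0, 8))   # hypothetical: first 8 logical cores belong to CPU 0

for proc in psutil.process_iter(["name"]):
    name = proc.info["name"] or ""
    if "vegas" in name.lower():
        proc.cpu_affinity(CPU0_CORES)   # pin the process to a single socket
        print(f"Pinned PID {proc.pid} ({name}) to cores {CPU0_CORES}")
```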
I forgot to add that up through Vegas 14 (I can't speak to 15 and 16), media codecs and formats matter in Vegas. Media in Sony MXF formats seems to be more stable and better optimized. Other formats like MOV and Avid codecs do not seem to work as well in Vegas. Even still-image formats can make a difference when dealing with large numbers of them.
Just some of my thoughts.