I'm remoting into a machine for data visualization work that mostly relies on WebGL. Both local and remote are Win10. Local is a laptop with an integrated Intel GPU; remote has a GeForce GT 1030. When I run the visualizations, Process Hacker shows GPU load and dedicated memory in use on both PCs: on the remote by the visualization itself, and on the local by the RDP process.
I'm wondering whether upgrading the GPU on either the local or remote machine would improve performance, since the current setup is pretty slow.