Can you try streaming the game to another PC? It could very well just be an issue with Mac OS.
Did you try updating the drivers on both the game PC and the Mac?
Try setting the game in the Nvidia control panel on both the PC and the Mac to run at performance.
(In the Nvidia control panel -> Adjust image settings with preview -> change that to performance.)
Then try streaming to your MacBook Pro. If that doesn't work, try changing both the MBP and the PC
to quality. Maybe that will help?
Also, on the MacBook: is Steam using the Intel onboard graphics or the Nvidia GT GPU?
There is a setting (I think) to switch from onboard to the Nvidia GPU.
See if it is set to Intel or if it is using the Nvidia GPU. One rough way to check from Terminal is sketched below.
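If you want to double-check which GPU is actually driving the display, something like the following Python sketch should work. This is just an assumption on my part, not something Steam documents: it reads system_profiler SPDisplaysDataType and guesses that the GPU whose section lists a "Displays:" block is the active one (the exact output wording can vary by OS X version).

# Rough sketch, assuming a dual-GPU retina MacBook Pro: on these machines,
# only the GPU currently driving the screen lists a "Displays:" section in
# system_profiler output. Requires Python 3.7+.
import subprocess

out = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

active_gpu = None
current_gpu = None
for line in out.splitlines():
    stripped = line.strip()
    # GPU section headers look like "Intel HD Graphics 4000:" or "NVIDIA GeForce GT 650M:"
    if stripped.endswith(":") and ("Intel" in stripped or "NVIDIA" in stripped):
        current_gpu = stripped.rstrip(":")
    # the "Displays:" block sits under whichever GPU is driving the screen
    elif stripped == "Displays:" and current_gpu:
        active_gpu = current_gpu

print("GPU driving the display:", active_gpu or "could not tell")

Run it before and during a streaming session to see whether the active GPU changes.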
Hope that helps some..
Good luck
-Sire
Game PC: i7 3770k 3.5 GHz / 16 GB RAM / GTX 980 Ti SC / 2x 240 GB SSD / Win 7 64-bit
Ark Server: i5 4960 3.5 GHz / 16 GB RAM / no GPU / 240 GB SSD / Win 7 64-bit
(Click my profile for server info)
Others with the same rMBP as the client report the exact same issue, so I suspect it is a bug with the Nvidia GPU decoding under Mac OS.
http://steamcommunity.com/groups/homestream/discussions/1/208684375410952584/
Steam bypasses any iGPU/dGPU setting I choose with gfxCardStatus. (www.gfx.io)
When I enable hardware decoding in Steam, it always activates the Nvidia dGPU as soon as I start streaming a game, even when the system is forced to iGPU-only with this app.
Annoyingly, Steam switches on the dGPU even when I have hardware decoding disabled. As a result the dGPU heats up the machine even though it is completely idle and the CPU does all the decoding!
I wonder if the Steam developers are reading this. Is there a bug tracker that I can submit this to?
With this version, Steam never disregards my integrated-only setting. It does with the official release.