Things fired with 75+ effective precision do not deviate.
Everything else rolls a d100: on a 1 or 2, precision is divided by 3; on a 3-6 inclusive, it's divided by 2.
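A minimal sketch of that roll in Python, assuming the thresholds above (the function name and structure are my own guesses, not actual game code):

```python
import random

def effective_precision(precision):
    """Apply the pre-shot deviation roll described above.

    Assumed mechanics: 75+ effective precision never deviates;
    anything lower rolls a d100, where a 1-2 divides precision
    by 3 and a 3-6 divides it by 2.
    """
    if precision >= 75:
        return precision  # no deviation at all
    roll = random.randint(1, 100)  # d100
    if roll <= 2:
        return precision / 3
    if roll <= 6:
        return precision / 2
    return precision  # 7-100: precision unchanged
```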
This thread goes into a lot of other stuff, and I posted a mockup of what I think the deviation function looked like when I reverse engineered it, but the big thing is that accuracy falls off a cliff when shooting beyond ((prec/2) - 2) squares if your precision is less than 10, or ((((prec*2) - 10)/2) - 2) squares (which simplifies to prec - 7) if it's over 10.
Practically, that means each point of precision over 10 gives you one more square at near-perfect accuracy before things start to go really wrong really quickly. I posted some poorly labelled plots in the thread above somewhere, I think.
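Here's a quick sketch of that range formula under the same assumptions (my naming; the piecewise split at 10 is taken from the description above):

```python
def near_perfect_range(prec):
    """Squares you can shoot before accuracy falls off a cliff.

    Below 10 precision: (prec/2) - 2.
    Over 10: ((prec*2) - 10)/2 - 2, which simplifies to prec - 7,
    so each point over 10 buys exactly one more square.
    """
    if prec < 10:
        return (prec / 2) - 2
    return prec - 7

# e.g. near_perfect_range(10) == 3, near_perfect_range(15) == 8
```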
"Huge" is subjective. While there's randomness in there (including things that aren't in the manual, there aren't (well, when I looked) any openended 2d6s in deciding where something ranged is going to land.
The formula is Precision/2 - 2.
The parentheses just hold the same formula in text form; they're not part of the formula itself.
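To make the notation concrete with example numbers of my own: at 8 precision, 8/2 - 2 = 2 squares of near-perfect accuracy; at 10 precision, 10/2 - 2 = 3.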
And thanks for the other replies, glad to know I was just misreading it after all.