[ofa-general] kernel CM question
hrosenstock at xsigo.com
Sat Dec 8 08:41:20 PST 2007
In cm.c, there is:
static inline int cm_convert_to_ms(int iba_time)
{
	/* approximate conversion to ms from 4.096us x 2^iba_time */
	return 1 << max(iba_time - 8, 0);
}
Seems to me that as iba_time gets larger, the approximation is off by
more: 1 << (iba_time - 8) effectively divides by 256 where the exact
factor is ~244.14 (1000 / 4.096), so the result understates the timeout
by about 4.9%, and the absolute error grows with iba_time, forcing the
use of the next lower time.
For example, if 22 is used:
ib_cm: req timeout_ms 16896 > 8192, decreasing
Is it too much computation to make this more accurate?