Combining Fitts’ law with Hick’s law

I'm working through a textbook and got stumped on this particular question:

Use Hick's and Fitts' laws to derive an expression for the time for a user to select an item in a menu, where b is the branching factor (the number of alternatives at each level) and n is the total number of options in the full menu. Assume the distance to move and the target size are independent of b.

What does your expression predict would be the optimal choice for b in order to minimize the overall selection time?

If I define Fitts' law as T = Im*log_2(2d/s), where T is the time to acquire a target of size s at distance d,

and Hick's law as Td = Ic*log_2(n+1) for n equally probable alternatives,

Would my answer simply be: total time = b*(Ic*log_2(n+1)) + (Im*log_2(2d/s))?

Or maybe: total time = (Ic*log_2(n+1/b)) + (Im*log_2(2d/s))?
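To sanity-check my two candidates, I tried plugging in some made-up numbers (the values of Ic, Im, d, s, and n below are arbitrary, and I'm reading my second candidate as log_2(n/b + 1)):

```python
import math

# Made-up constants, just for exploring the shape of the curves:
Ic, Im = 0.2, 0.1   # Hick's and Fitts' coefficients (seconds per bit)
d, s = 100.0, 10.0  # pointing distance and target size (same units)
n = 64              # total number of options in the full menu

def candidate1(b):
    # First guess: b times the Hick time over all n options,
    # plus one Fitts movement.
    return b * Ic * math.log2(n + 1) + Im * math.log2(2 * d / s)

def candidate2(b):
    # Second guess: one Hick decision over n/b options,
    # plus one Fitts movement.
    return Ic * math.log2(n / b + 1) + Im * math.log2(2 * d / s)

for b in (2, 4, 8, 16):
    print(b, round(candidate1(b), 3), round(candidate2(b), 3))
```

What I notice is that the first candidate only grows with b (so it would be "minimized" at b = 1) and the second only shrinks with b (minimized at b = n), so neither gives an interior optimum for b. That makes me suspect the expression should instead multiply a per-level time by the number of levels, log_b(n), so that the two effects trade off.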