The first VC bound involves the growth function, which depends on the training sample size $N$. To make this dependence explicit in the capacity measure, we will instead use the VC-dimension.
This new bound relies on a combinatorial result known as Sauer's Lemma, which allows us to bound the growth function in terms of the VC-dimension.
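For reference, the consequence of Sauer's Lemma used below can be stated as follows (writing $S_\F(n)$ for the growth function of $\F$ on $n$ points, a notation assumed here): $$ S_\F(n) \;\leq\; \left(\frac{e n}{\VC(\F)}\right)^{\VC(\F)}, \quad \text{for all } n \geq \VC(\F). $$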
In a binary classification setting with a given function class $\F$ of VC-dimension $\VC(\F)$, we have the following bound on the risk $R(f)$ of any classifier $f\in\F$.
For all $N>\VC(\F)/2$ and all $\delta > 0$, the bound $$ \forall f\in\F,\quad R(f) \leq R_{emp}(f) + 2\sqrt{\frac{2}{N}\left( \VC(\F) \log \frac{2 e N }{\VC(\F)} + \log\frac{2}{\delta} \right) } $$ holds with probability at least $1-\delta$.
Note that $e=\exp(1)$ in the above.

Proof: This result is a direct consequence of the first VC bound combined with Sauer's Lemma applied with $n=2N$: since $2N > \VC(\F)$, the lemma gives $\log S_\F(2N) \leq \VC(\F)\log\frac{2eN}{\VC(\F)}$, and substituting this into the first VC bound yields the stated inequality.
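As a numerical illustration (the values of $\VC(\F)$, $N$ and $\delta$ below are chosen arbitrarily for this example), take $\VC(\F)=10$, $N=10\,000$ and $\delta=0.05$. The confidence term then evaluates to $$ 2\sqrt{\frac{2}{10\,000}\left( 10 \log\frac{2e\cdot 10\,000}{10} + \log\frac{2}{0.05} \right)} \approx 2\sqrt{\frac{2}{10\,000}\left( 86.0 + 3.7 \right)} \approx 0.27, $$ so with probability at least $0.95$, every $f\in\F$ satisfies $R(f) \leq R_{emp}(f) + 0.27$.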