To start, you are looking to show convergence in the sense of distributions.  So, for a test function $g$, you want

$$
\int_{-1}^1 f_n(x) g(x)\,dx \rightarrow g(0) \quad \text{as } n \to \infty.
$$

Now, because the integral is linear and $\int_{-1}^1 f_n(x)\,dx=1$, we have that

$$
g(0)=g(0)\int_{-1}^1 f_n(x)\,dx=\int_{-1}^1 f_n(x) g(0)\, dx.
$$

Because you are trying to show this convergence, you want to look at

\begin{eqnarray*}
\left|\int_{-1}^1 f_n(x) g(x)\, dx - g(0)\right|= \left|\int_{-1}^1 f_n(x) (g(x)-g(0))\, dx\right|.
\end{eqnarray*}

Separate the integral into three pieces ($\int_{-1}^{-c}$, $\int_{-c}^{c}$, $\int_{c}^{1}$) and use the triangle inequality.  Here is where you use the uniform convergence to show that the integrals over $[-1,-c]$ and $[c,1]$ tend to $0$.  All that is left is to show that the integral over $[-c,c]$ is less than an arbitrary $\varepsilon>0$, which is where the continuity of $g$ at $0$ and the normalization $\int_{-1}^1 f_n(x)\,dx=1$ come in.
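
Concretely, the split looks like this (a sketch of the estimate, under the usual assumptions that $f_n\ge 0$ on $[-1,1]$ and that $f_n\to 0$ uniformly on $[-1,-c]\cup[c,1]$ for each fixed $c>0$; adjust to however your $f_n$ is actually defined):

\begin{eqnarray*}
\left|\int_{-1}^1 f_n(x)(g(x)-g(0))\,dx\right| &\le& \int_{-1}^{-c} f_n(x)\,|g(x)-g(0)|\,dx + \int_{c}^{1} f_n(x)\,|g(x)-g(0)|\,dx \\
&& {}+ \int_{-c}^{c} f_n(x)\,|g(x)-g(0)|\,dx.
\end{eqnarray*}

The two outer integrals are at most $2\,\sup_{c\le|x|\le 1} f_n(x)\cdot\sup_{|x|\le 1}|g(x)-g(0)|$, which tends to $0$ as $n\to\infty$ by the uniform convergence (and the boundedness of the continuous $g$ on $[-1,1]$).  The middle integral is at most $\sup_{|x|\le c}|g(x)-g(0)|\cdot\int_{-1}^{1} f_n(x)\,dx=\sup_{|x|\le c}|g(x)-g(0)|$, and continuity of $g$ at $0$ lets you pick $c$ small enough that this is below $\varepsilon$.  Choosing $c$ first and then letting $n\to\infty$ makes the whole expression eventually less than $2\varepsilon$; since $\varepsilon>0$ was arbitrary, the limit is $0$.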