The project aims to explore the use of deep neural networks in machine translation (MT). It consists of two parts. The first part builds on the currently popular topic of word embeddings, i.e., the representation of words in a continuous vector space. We will enrich the recently published Skip-gram model with morphological information and evaluate the resulting model on English and Czech datasets. The second part applies these vector representations of words to machine translation. We will use our model in several neural network architectures, such as feedforward networks, LSTMs, and convolutional networks. The best-performing architecture will be used to score and rerank translation candidates. We will also employ the model in an end-to-end neural MT system.
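The abstract does not specify how morphological information would be injected into the Skip-gram model; one common approach (used, e.g., in fastText-style subword models) is to compose a word's vector from vectors of its character n-grams, so that morphologically related forms share parameters. The following minimal sketch illustrates that idea only; the class and function names are illustrative, not part of the proposed system, and no actual Skip-gram training is performed here.

```python
import numpy as np

def char_ngrams(word, n_min=3, n_max=5):
    """Extract character n-grams from a word wrapped in boundary markers."""
    w = f"<{word}>"
    return [w[i:i + n] for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)]

class SubwordEmbedder:
    """Compose a word vector as the mean of its character n-gram vectors.

    In a full Skip-gram model these n-gram vectors would be the trained
    input embeddings; here they are random, which already shows how
    morphologically related words end up with similar representations.
    """
    def __init__(self, dim=50, buckets=10_000, seed=0):
        rng = np.random.default_rng(seed)
        # Hash n-grams into a fixed number of buckets to bound memory.
        self.table = rng.normal(scale=1.0 / dim, size=(buckets, dim))
        self.buckets = buckets

    def vector(self, word):
        idx = [hash(g) % self.buckets for g in char_ngrams(word)]
        return self.table[idx].mean(axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

emb = SubwordEmbedder()
# Related forms share most n-grams, hence similar composed vectors,
# even before any Skip-gram training takes place.
print(cosine(emb.vector("translation"), emb.vector("translations")))
print(cosine(emb.vector("translation"), emb.vector("xylophone")))
```

Sharing subword vectors in this way is particularly attractive for a morphologically rich language such as Czech, where many inflected forms of a word are rare or unseen in training data.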