Some people can mix two languages within the same sentence: this is known as intra-sentential code-switching. The majority of computational models of language comprehension are dedicated to a single language. Some bilingual models have also been developed, but very few have explored the code-switching case. We collected data from human subjects who were asked to mix pairs of given sentences in French and English. Truly bilingual subjects produced more switches within the same sentence. The resulting corpus contains some very complex mixed sentences: there can be up to eleven language switches within a single sentence. We then trained ResPars, a Reservoir-based sentence Parsing model, on the collected corpus. This Recurrent Neural Network model processes sentences incrementally, word by word, and outputs the sentence meaning (i.e. thematic roles). Surprisingly, the model is able to learn and generalize on the mixed corpus with performance nearly as good as on the unmixed French-English corpus.
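To make the reservoir-based, word-by-word processing concrete, here is a minimal sketch of an echo state network with a trained linear readout producing thematic-role labels. It is only illustrative of the general technique, not the authors' implementation: the vocabulary, role set, reservoir size, leak rate, spectral radius, and ridge regularization are all assumed toy values.

```python
# Minimal echo-state-network sketch: fixed random reservoir, trained readout.
# Vocabulary, roles, and all hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy mixed French-English vocabulary and thematic-role targets (assumptions).
vocab = ["the", "le", "cat", "chat", "eats", "mange", "fish", "poisson"]
roles = ["AGENT", "ACTION", "PATIENT"]
word_idx = {w: i for i, w in enumerate(vocab)}

n_in, n_res, n_out = len(vocab), 300, len(roles)
leak, rho = 0.3, 0.9  # leak rate and target spectral radius (assumed values)

# Fixed random input and recurrent weights; only the readout is trained.
W_in = rng.uniform(-1, 1, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= rho / max(abs(np.linalg.eigvals(W)))  # rescale to the target spectral radius

def run_reservoir(sentence):
    """Drive the reservoir word by word; return the state after each word."""
    x = np.zeros(n_res)
    states = []
    for w in sentence:
        u = np.zeros(n_in)
        u[word_idx[w]] = 1.0  # one-hot input for the current word
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Tiny training set: each word is tagged with its thematic role (toy data).
train = [
    (["the", "cat", "eats", "the", "fish"],
     ["AGENT", "AGENT", "ACTION", "PATIENT", "PATIENT"]),
    (["le", "chat", "mange", "le", "poisson"],
     ["AGENT", "AGENT", "ACTION", "PATIENT", "PATIENT"]),
    (["the", "chat", "eats", "le", "poisson"],       # code-switched example
     ["AGENT", "AGENT", "ACTION", "PATIENT", "PATIENT"]),
]

X = np.vstack([run_reservoir(s) for s, _ in train])
Y = np.vstack([np.eye(n_out)[[roles.index(r) for r in rs]] for _, rs in train])

# Ridge-regression readout from reservoir states to role labels.
ridge = 1e-2
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y).T

# Test on an unseen code-switched sentence.
test = ["le", "cat", "mange", "the", "fish"]
pred = run_reservoir(test) @ W_out.T
print(list(zip(test, [roles[i] for i in pred.argmax(axis=1)])))
```

The design choice this sketch highlights is the one named in the text: the recurrent part of the network is left untrained, and only a linear readout is fit, which is what allows the same incremental, word-by-word processing to be applied whether the input sentence is monolingual or mixed.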