Electrophysiological research has shown that pseudowords elicit more negative event-related potentials (ERPs) than words within 250 ms after the lexical status of a speech token is defined (e.g., after hearing the onset of “ga” in the Spanish word “lechuga” versus “da” in the pseudoword “lechuda”). Since lip-read context also affects speech-sound processing within this time frame, we investigated whether these two context effects on speech perception operate together. We measured ERPs while listeners were presented with auditory-only, audiovisual, or lip-read-only stimuli in which the critical syllable that determined lexical status was naturally timed (Experiment 1) or delayed by ∼800 ms (Experiment 2). We replicated the electrophysiological effect of stimulus lexicality, and we also observed substantial effects of audiovisual speech integration for both words and pseudowords. Critically, we found several early time windows (< 400 ms) in which both contexts influenced auditory processing, but we never observed cross-talk between the two types of speech context. This absence of interaction supports the view that lip-read and lexical context operate largely independently, and may have different neural bases and purposes.