Natural language processing - Wikipedia
Overview
Natural Language Processing (NLP) is an interdisciplinary field of computer science and artificial intelligence focused on the automated interpretation, analysis, and generation of human language.
Key Insights
- Core Functional Pillars: Modern NLP systems are built around four primary technical tasks: speech recognition, text classification, natural language understanding (NLU), and natural language generation (NLG); a minimal classification sketch follows this list.
- Symbolic Era (1950s–1990s): Early research relied on "Symbolic NLP," which used hand-coded rules and conceptual ontologies; notable milestones include Turing's 1950 test, the 1954 Georgetown–IBM machine-translation experiment, and early chatbots such as ELIZA (mid-1960s).
- Statistical Revolution: In the late 1980s, the field shifted toward statistical models and machine learning, driven by growing computational power and the availability of large textual corpora (e.g., IBM's alignment models for machine translation, trained on bilingual records of the Canadian and European parliaments).
- Methodological Shift: The transition from rule-based systems to data-driven approaches marked a departure from Chomskyan generative grammar toward empirical corpus linguistics, allowing systems to learn language patterns from real-world data rather than rigid logic; the bigram sketch below shows this counting-based style in miniature.
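
To make the text-classification pillar concrete, the sketch below hand-rolls a naive Bayes classifier, one standard statistical approach to the task; the toy sentiment corpus, its labels, and the add-one smoothing are illustrative assumptions, not details from the article.

```python
from collections import Counter, defaultdict
import math

# Toy labelled corpus; invented for illustration, not from the article.
train = [
    ("the film was wonderful and moving", "pos"),
    ("a delightful well acted story", "pos"),
    ("the plot was dull and predictable", "neg"),
    ("poorly written and badly paced", "neg"),
]

# Count documents per class and word frequencies per class.
class_docs = Counter()
word_counts = defaultdict(Counter)
for text, label in train:
    class_docs[label] += 1
    word_counts[label].update(text.split())

vocab = {w for counts in word_counts.values() for w in counts}

def log_prob(text, label):
    """Log P(label) plus the sum of log P(word | label), with add-one smoothing."""
    total = sum(word_counts[label].values())
    lp = math.log(class_docs[label] / sum(class_docs.values()))
    for w in text.split():
        lp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
    return lp

def classify(text):
    """Pick the class with the highest posterior log-probability."""
    return max(class_docs, key=lambda label: log_prob(text, label))

print(classify("a wonderful story"))      # pos
print(classify("dull and poorly paced"))  # neg
```

Everything the classifier knows comes from counting words in the training examples, which is exactly the corpus-driven style the Statistical Revolution item describes.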
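
Similarly, the methodological shift away from hand-coded rules can be seen in miniature with a bigram language model: word-sequence probabilities are estimated purely by counting pairs in a corpus, with no grammar encoded anywhere. The three-sentence corpus below is an invented stand-in for the far larger parliamentary corpora mentioned above.

```python
from collections import Counter

# Toy corpus; a tiny stand-in for the parliamentary-record corpora named above.
corpus = "the house will vote today . the house adjourned . the vote passed ."

tokens = corpus.split()
unigram_counts = Counter(tokens)
bigram_counts = Counter(zip(tokens, tokens[1:]))

def bigram_prob(next_word, prev_word):
    """Maximum-likelihood estimate of P(next_word | prev_word) from raw counts."""
    return bigram_counts[(prev_word, next_word)] / unigram_counts[prev_word]

# The model "learns" that "house" follows "the" more often than "vote" does,
# purely from the data; no hand-coded grammar rule is involved.
print(bigram_prob("house", "the"))  # 2/3
print(bigram_prob("vote", "the"))   # 1/3
```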