Electrical Communications Systems ECE.09.433, Spring 2019
Lecture 1b, January 24, 2019
Shreekanth Mandayam, ECE Department, Rowan University
ECOMMS: Topics
Plan
- Baseband and Bandpass Signals
  - Recall: Comm. Sys. Block diagram
  - Aside: Why go to higher frequencies?
  - International & US Frequency Allocations
- Introduction to Information Theory
  - Recall: List of topics
  - Probability
  - Information
  - Entropy
- Signals and Noise
Comm. Sys. Block Diagram
[Block diagram: message m(t) → Tx → s(t) → Channel (noise added) → r(t) → Rx; the Tx performs modulation, the Rx performs demodulation or detection]
- Baseband Signal: "low" frequencies (< 20 kHz), at the original data rate
- Bandpass Signal: "high" frequencies (> 300 kHz), at the transmission data rate
Formal definitions will be provided later.
Aside: Why go to higher frequencies?
Half-wave dipole antenna: the Tx antenna length is λ/2, where c = fλ and c = 3×10⁸ m/s.
Calculate λ for f = 5 kHz and for f = 300 kHz.
There are also other reasons for going from baseband to bandpass.
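A quick way to see the point of this calculation is to evaluate c = fλ numerically. A minimal MATLAB sketch (the script itself is illustrative and not part of the original slide; only the two example frequencies come from the slide):

c = 3e8;                      % speed of light, m/s
f = [5e3 300e3];              % example frequencies from the slide: 5 kHz and 300 kHz
lambda = c ./ f;              % wavelength lambda = c/f, in m
L = lambda / 2;               % half-wave dipole length, in m
fprintf('f = %g Hz: lambda = %g m, dipole length = %g m\n', [f; lambda; L]);

At 5 kHz the half-wave dipole would be about 30 km long; at 300 kHz it shrinks to about 500 m, which suggests why transmission moves from baseband to higher (bandpass) frequencies.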
Frequency Allocations
International Frequency Allocations
US Frequency Allocation Chart
Information
Recall:
- Information Source: a system that produces messages (waveforms or signals)
- Digital/Discrete Information Source: produces a finite set of possible messages
- Digital/Discrete Waveform: a function of time that can take on only discrete values
- Digital Communication System: transfers information from a digital source to a digital sink
[Diagram: Info Source → Comm System → Sink]
Another Classification of Signals (Waveforms)
- Deterministic Signals: can be modeled as a completely specified function of time
- Random or Stochastic Signals: cannot be completely specified as a function of time; must be modeled probabilistically
What type of signals are information bearing?
Signals and Noise (Lab 1)
Comm. Waveform = Signal (desired) + Noise (undesired)
Strictly, both signals and noise are stochastic and must be modeled as such.
We will make these approximations, initially:
- Noise is ignored
- Signals are deterministic
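Lab 1 treats the communication waveform as a desired signal plus undesired noise. A minimal MATLAB sketch of that picture, using an assumed 1 kHz sinusoid and an assumed Gaussian noise level (neither is specified on the slide):

fs = 8000;                     % assumed sampling rate, Hz
t  = 0:1/fs:0.01;              % 10 ms of time samples
signal = sin(2*pi*1000*t);     % deterministic (desired) signal: 1 kHz sinusoid
noise  = 0.2*randn(size(t));   % stochastic (undesired) noise
waveform = signal + noise;     % comm. waveform = signal + noise
plot(t, waveform); xlabel('t (s)'); ylabel('amplitude');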
Measures of Information
Definitions:
- Probability
- Information
- Entropy
- Source Rate
Recall: Shannon's Theorem: if R < C = B log2(1 + S/N), then we can have error-free transmission in the presence of noise.
MATLAB DEMO: entropy.m
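The entropy.m demo itself is not reproduced on the slide. A minimal MATLAB sketch of what such a demo might compute, showing the probability → information → entropy chain and a Shannon capacity check; the probability vector, bandwidth, and S/N values below are assumed examples, not taken from the course materials:

p = [0.5 0.25 0.125 0.125];    % assumed message probabilities (must sum to 1)
I = -log2(p);                  % information content of each message, in bits
H = sum(p .* I);               % entropy: average information per message, bits/message
fprintf('H = %.3f bits/message\n', H);

B   = 3000;                    % assumed channel bandwidth, Hz
SNR = 1000;                    % assumed linear signal-to-noise ratio (30 dB)
C   = B * log2(1 + SNR);       % Shannon channel capacity C = B log2(1 + S/N), bits/s
fprintf('C = %.0f bits/s\n', C);

Any source rate R below the computed C can, per Shannon's theorem, be transmitted error-free in the presence of noise.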
Summary