19 votes
A signal x_a(t) is assumed to be bandlimited to 30 kHz. It is desired to filter this signal with an ideal bandpass filter that passes the frequencies between __ kHz and 4 kHz, using a system for processing analog signals composed of a digital filter with a given frequency response sandwiched between an ideal A/D and an ideal D/A, both operating at sampling interval T. 1. Determine the Nyquist sampling frequency (in kHz) for the input signal. 2. Find the largest sampling period T (in s) for which the overall system comprising A/D, digital filter, and D/A realizes the desired bandpass filter.

asked by Ivan Gabriele (2.7k points)

1 Answer

16 votes

Answer:

Hello, your question is incompletely transcribed; the complete question is attached below as an image.

answer :

1) 60 kHz

2) Tmax = (1/34000) s

Step-by-step explanation:

1) Determine the Nyquist sampling frequency (in kHz) for the input signal.

Fs = 2 · Fmax

Fmax = 30 kHz (since x_a(t) is bandlimited to 30 kHz)

∴ Nyquist sampling frequency Fs = 2 × 30 = 60 kHz
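The calculation above can be sketched in a few lines of Python (variable names are mine, not from the problem):

```python
# Nyquist rate for a signal bandlimited to f_max = 30 kHz (from the problem).
f_max = 30e3            # highest frequency present in x_a(t), in Hz
f_nyquist = 2 * f_max   # minimum alias-free sampling rate (Nyquist rate)
print(f_nyquist / 1e3, "kHz")  # -> 60.0 kHz
```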

2) Determine the largest sampling period (in s).

Nyquist sampling period = 1/Fs = (1/60000) s

However, we can sample slower than the Nyquist rate: aliasing of the input is harmless as long as the aliased components land above the filter's 4 kHz upper cutoff, since the bandpass filter rejects them. In the digital domain (ω = 2πfT), the image of the 30 kHz band edge appears at 2π − 2πT·30 kHz and must stay at or above the digital cutoff 2πT·4 kHz, which gives the relationship below:

2π − 2π · T · (30 kHz) ≥ 2π · T · (4 kHz)

∴ T ≤ 1/(34 kHz)

largest sampling period Tmax = (1/34000) s ≈ 29.4 μs
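The boundary case can be checked numerically: at Fs = 34 kHz the 30 kHz band edge folds down to exactly the 4 kHz cutoff, so any faster sampling keeps all aliases out of the passband (a minimal sketch; variable names are mine):

```python
# At the limiting rate Fs = f_max + f_cut, the aliased image of the
# 30 kHz band edge lands exactly on the filter's 4 kHz upper cutoff.
f_max = 30e3              # signal bandwidth (Hz)
f_cut = 4e3               # upper cutoff of the desired bandpass filter (Hz)
fs_min = f_max + f_cut    # smallest usable sampling rate: 34 kHz
t_max = 1 / fs_min        # largest sampling period: 1/34000 s

alias = fs_min - f_max    # where the 30 kHz edge folds to
print(fs_min, t_max, alias)  # -> 34000.0  ~2.94e-05  4000.0
```

Any Fs above 34 kHz pushes the folded image of 30 kHz strictly above 4 kHz, so the digital bandpass filter still realizes the desired analog response.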

[Attached image: complete problem statement]
answered by Pierrelouis (2.7k points)