Agilent Technologies 53220A Datasheet - Page 2


53220A

Manufacturer Part Number
53220A
Description
Universal Frequency Counter/Timer, 350 MHz, 12 digits/sec, 100 ps
Manufacturer
Agilent Technologies Test Equipment
Type
Counter/Timer
Specifications of 53220A

Frequency Range
DC coupled 1 mHz to 350 MHz; AC coupled 10 Hz to 350 MHz
Input Impedance
Selectable 1 MΩ ±1.5% or 50 Ω ±1.5%
Input Type
Front panel BNC
Input, Range
±5 V (±50 V) full-scale ranges
Noise
500 μVrms (max), 350 μVrms (typ)
Number Of Channels
3 channels
Sensitivity
DC to 100 MHz: 20 mVpk; >100 MHz: 40 mVpk
More Bandwidth
• 350 MHz baseband frequency
• 6 or 15 GHz optional microwave channels

More Resolution & Speed
• 12 digits/sec
• 20 ps single-shot time resolution
• Up to 75,000 and 90,000 readings/sec (frequency and time interval)

More Insight
• Datalog trend plot
• Cumulative histogram
• Built-in math analysis and statistics
• 1M reading memory and USB Flash storage

More Connectivity
• LXI-C/Ethernet LAN, USB (see the connection sketch after this list)
• Optional GPIB interface
• Optional battery for portability and timebase accuracy

More Measurement Capability (53230A only)
• Continuous gap-free measurements
• Basic modulation domain analysis (MDA) and timestamp
• Optional pulse/burst microwave measurement
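
With the LAN, USB, or GPIB interfaces listed above, the counter can be driven remotely over SCPI. Below is a minimal sketch assuming Python with PyVISA; the IP address is hypothetical, and the exact channel-list syntax of the MEASure:FREQuency? query should be verified against the 53200 Series programming reference.

    import pyvisa

    rm = pyvisa.ResourceManager()
    counter = rm.open_resource("TCPIP0::192.168.1.50::INSTR")  # hypothetical address

    print(counter.query("*IDN?").strip())           # identify the instrument
    freq = float(counter.query("MEAS:FREQ? (@1)"))  # one-shot frequency, channel 1
    print(f"Frequency: {freq:.6f} Hz")

    counter.close()
    rm.close()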
Imagine your counter doing More!
Introduction
Frequency counters are depended on in R&D and in manufacturing for the fastest, most accurate frequency and time interval measurements. The 53200 Series of RF and universal frequency counter/timers expands on this expectation, providing the most information, connectivity, and new measurement capabilities while building on the speed and accuracy of Agilent's decades of time and frequency measurement expertise.

Three available models offer single-shot frequency resolution of up to 12 digits/sec on a one-second gate. Single-shot time interval measurements can be resolved down to 20 ps. All models offer new built-in analysis and graphing capabilities to maximize the insight and information you receive.
Definitions
The following definitions apply to the specifications and characteristics described throughout.
Specification (spec)
The warranted performance of a calibrated instrument that has been stored for a minimum of 2 hours within the operating temperature range of 0 °C to 55 °C and after a 45-minute warm-up period. Automated calibration (*CAL?) performed within ±5 °C before measurement. All specifications were created in compliance with ISO-17025 methods. Data published in this document are specifications unless otherwise noted.
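
The automated calibration referenced throughout these definitions is the *CAL? common command, which can be run programmatically before a critical measurement. The following is a hedged sketch using Python with PyVISA: the VISA address is hypothetical, and the zero-means-pass return convention is common SCPI practice that should be confirmed in the programming reference.

    import pyvisa

    rm = pyvisa.ResourceManager()
    counter = rm.open_resource("TCPIP0::192.168.1.50::INSTR")  # hypothetical address
    counter.timeout = 30000  # ms; automated calibration can take a while

    # *CAL? runs the internal automated calibration and returns a status
    # code; 0 conventionally indicates success.
    status = int(counter.query("*CAL?"))
    print("Autocal passed" if status == 0 else f"Autocal failed, code {status}")

    counter.close()
    rm.close()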
Typical (typ)
The characteristic performance which 80% or more of manufactured instruments will meet. This data is not warranted, does not include measurement uncertainty, and is valid only at room temperature (approximately 23 °C). Automated calibration (*CAL?) performed within ±5 °C before measurement.
Nominal (nom)
The mean or average characteristic performance, or the value of an attribute that is determined by design, such as a connector type, physical dimension, or operating speed. This data is not warranted and is measured at room temperature (approximately 23 °C). Automated calibration (*CAL?) performed within ±5 °C before measurement.
Measured (meas)
An attribute measured during development for purposes of communicating the expected performance. This data is not warranted and is measured at room temperature (approximately 23 °C). Automated calibration (*CAL?) performed within ±5 °C before measurement.
Stability
Represents the 24-hour, ±1 °C short-term, relative measurement accuracy. Includes measurement error and 24-hour ±1 °C timebase aging error.
Accuracy
Represents the traceable measurement accuracy of a measurement for T_CAL ± 5 °C. Includes measurement error, timebase error, and calibration source uncertainty. Random measurement errors are combined using the root-sum-square method and are multiplied by M for the desired confidence level. Systematic errors are added linearly and include time skew errors, trigger timing errors, and timebase errors as appropriate for each measurement type.
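
As an illustration of this combination rule (the error magnitudes below are made up for the example, not instrument specifications), the following sketch RSS-combines the random terms, scales them by the coverage factor M, and adds the systematic terms linearly:

    import math

    def total_error(random_errors, systematic_errors, m=2.0):
        # Random errors combine by root-sum-square, scaled by coverage factor M;
        # systematic errors (time skew, trigger timing, timebase) add linearly.
        random_part = m * math.sqrt(sum(e * e for e in random_errors))
        return random_part + sum(systematic_errors)

    # Hypothetical error budget, in seconds, for a time interval measurement:
    print(total_error(random_errors=[20e-12, 15e-12],
                      systematic_errors=[10e-12, 5e-12]))  # 6.5e-11 s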
T_CAL
Represents the ambient temperature of the instrument during the last adjustment to calibration reference standards.

T_ACAL
Represents the temperature of the instrument during the last automated calibration (*CAL?) operation. T_ACAL must be between 10 °C and 45 °C for a valid instrument calibration.
All information in this document is subject to change without notice.