The current must change at a rate of [tex]10^{4}[/tex] A/sec to induce an emf of 1000 volts.
We have an inductor of inductance 100 mH (millihenry).
We have to calculate the rate at which the current in the inductor has to change to induce an emf of 1000 volts.
According to Faraday's law of induction, the emf induced in a coil placed in a time-varying magnetic field is equal to the negative of the rate of change of magnetic flux through it, multiplied by the number of turns. Mathematically -
ε = [tex]- N\frac{d\phi}{dt}[/tex]
Where -
ε is the induced voltage
N is the number of turns
[tex]\frac{d\phi}{dt}[/tex] represents the rate of change of magnetic flux
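As a quick illustration (the values here are assumed for the example, not taken from the question), a minimal Python sketch of Faraday's law looks like this:

# Faraday's law: emf = -N * (dphi/dt)
N = 50            # number of turns (assumed for illustration)
dphi_dt = 0.02    # rate of change of flux in Wb/s (assumed)
emf = -N * dphi_dt
print(emf)        # -1.0, i.e. an induced emf of -1 V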
Now, in the question it is given that -
L = 100 mH = 0.1 H
ε = 1000 Volts
The magnetic flux through an inductor is proportional to the current flowing in it, so for an inductor the induced voltage is written in terms of the rate of change of current:
[tex]V = L \frac{di}{dt}[/tex]
[tex]\frac{di}{dt} = \frac{V}{L} = \frac{1000}{0.1} = 10^{4}[/tex] A/sec
Hence, the current must change at a rate of [tex]10^{4}[/tex] A/sec to induce an emf of 1000 volts.
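As a sanity check, here is a minimal Python sketch that plugs the question's values into di/dt = V/L:

# Required rate of current change: di/dt = V / L
L = 100e-3    # inductance: 100 mH expressed in henries
V = 1000.0    # desired induced emf in volts
di_dt = V / L
print(di_dt)  # 10000.0, i.e. 10^4 A/sec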