
Shakti Dubey IAS Topper Story: From Policeman’s Daughter to UPSC Rank 1


“Shakti Dubey IAS” is not just a name at the top of the UPSC Civil Services 2024 result list; it is a symbol of perseverance, grit, and the triumph of humble beginnings. Her journey from the modest streets of Prayagraj to All India Rank 1 is a story that deserves to be told, and retold, as a source of inspiration for every aspirant in the country.

Background: Who is Shakti Dubey IAS?

Born and raised in Prayagraj, Uttar Pradesh, Shakti Dubey is the daughter of a Sub-Inspector in the Uttar Pradesh Police. She belongs to a middle-class family where discipline and integrity were values practiced more than preached. Her father’s life in the police service taught her early lessons in duty, resilience, and public service.

Growing up, there were no elite coaching centers or expensive resources at her disposal. What she had was something much more powerful — an unyielding will to serve the nation and change lives.

The Struggles Before Success: A Story of Resilience

The title “Shakti Dubey IAS” was not earned overnight.

This was her fifth attempt at cracking the UPSC Civil Services Examination — an exam known not just for its toughness, but for testing every fiber of a candidate’s emotional and intellectual resilience.

For four years, she faced rejection, disappointment, and emotional turmoil. But instead of giving up, Shakti chose to transform every setback into a setup for a comeback.

Her Strategy: Biochemistry, Newspaper Reading & Answer Writing

Unlike most toppers who choose more conventional optional subjects like PSIR or Sociology, Shakti Dubey chose Biochemistry — a highly technical subject — and mastered it with precision.

Her daily schedule included:

  • Reading The Hindu and Indian Express religiously
  • Practicing answer writing for both GS and Essay papers
  • Mock tests and peer evaluations
  • Staying consistent, even when motivation faded

One of her favorite self-written affirmations was:

“UPSC tests your preparation, but more than that, it tests your patience.”

UPSC CSE 2024 Result: Shakti Dubey Secures AIR 1

On April 22, 2025, the UPSC Civil Services Final Result 2024 was declared.

And there it was:

All India Rank 1: Shakti Dubey – Roll Number 240782

The news spread like wildfire. The media, coaching institutes, mentors, and aspirants across the country began searching:

Who is Shakti Dubey IAS?

Overnight, her journey became a national headline — not because she topped the exam, but because she did it against all odds, with authenticity, humility, and relentless consistency.

What Shakti Dubey IAS Said After the Result

When asked what kept her going despite four failed attempts, Shakti Dubey IAS replied:

“This rank isn’t just mine. It belongs to every girl who is told her dreams are too big. If I can do it, you can too. UPSC rewards consistency, not perfection.”

She also emphasized the importance of self-study, reading newspapers daily, and not comparing your journey to others.

Shakti Dubey’s Vision as an IAS Officer

Shakti has expressed a strong interest in working for:

  • Rural education reform
  • Healthcare access for marginalized communities
  • Women’s empowerment and safety

Her goal isn’t just to be an officer but to be a transformational leader. She believes bureaucracy should be accessible, empathetic, and rooted in ground realities.

Lessons from Shakti Dubey IAS for UPSC Aspirants

Here’s what every UPSC aspirant can learn from her journey:

1. Failure is feedback, not defeat

Shakti didn’t let four failed attempts define her. Instead, she used each one to fine-tune her strategy.

2. Optional subject doesn’t matter as much as mastery

Despite Biochemistry being rare, she made it her strength.

3. Mindset is everything

The way you think about your journey impacts how far you’ll go. Shakti’s mental strength was her biggest weapon.

Why “Shakti Dubey IAS” is Trending

Within days of the result, “Shakti Dubey IAS” became one of the most searched UPSC-related keywords in India. Her name is now associated with:

  • Rank 1 UPSC 2024
  • Inspirational success stories
  • UPSC preparation without coaching
  • UPSC Biochemistry optional strategy

Final Words: The Legacy Begins

Shakti Dubey IAS is not just a topper. She’s a torchbearer for every student from a small town with a big dream. Her story proves once again:

“The size of your dreams should always exceed the size of your obstacles.”

As she begins her training at LBSNAA and prepares for real-world governance challenges, one thing is certain — India has found not just a brilliant officer, but a deeply grounded one.

Quantum GANs – Generative Adversarial Networks: Quantum Approaches to Data Generation


Table of Contents

  1. Introduction
  2. Classical GANs: A Brief Overview
  3. Motivation for Quantum GANs
  4. Structure of a Quantum GAN (QGAN)
  5. Quantum Generator: Circuit-Based Design
  6. Quantum Discriminator Options
  7. Hybrid Classical-Quantum Architectures
  8. Objective Functions and Training
  9. Training QGANs on NISQ Devices
  10. Gradient Estimation in QGANs
  11. QGAN with Real-Valued Data
  12. QGANs for Image and State Generation
  13. QGANs vs Classical GANs
  14. Implementation with PennyLane
  15. Implementation with Qiskit
  16. Quantum Wasserstein GANs (QWGAN)
  17. Challenges in QGAN Training
  18. Experimental Realizations and Research
  19. Applications of QGANs
  20. Conclusion

1. Introduction

Quantum Generative Adversarial Networks (QGANs) are quantum analogs of classical GANs, designed to generate synthetic data through a competitive training process between two models: a generator and a discriminator.

2. Classical GANs: A Brief Overview

  • Generator creates fake samples
  • Discriminator classifies real vs fake
  • Minimax training:
    \[
    \min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]
    \]
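
With the expectations estimated over mini-batches, this objective becomes two concrete loss functions. A minimal sketch in plain Python (illustrative only; `d_real` and `d_fake` stand for discriminator outputs on real and generated batches, and the generator loss shown is the common non-saturating variant):

import numpy as np

def discriminator_loss(d_real, d_fake):
    # Maximize log D(x) + log(1 - D(G(z))): minimize the negation
    return -(np.log(d_real) + np.log1p(-d_fake)).mean()

def generator_loss(d_fake):
    # Non-saturating variant: maximize log D(G(z))
    return -np.log(d_fake).mean()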

3. Motivation for Quantum GANs

  • Leverage high-dimensional quantum state spaces for generative modeling
  • Potential speedups in learning complex distributions
  • Natural fit for quantum data generation and simulation

4. Structure of a Quantum GAN (QGAN)

  • Quantum Generator: Variational quantum circuit
  • Discriminator: Classical or quantum model
  • Output: Distribution of quantum or classical samples

5. Quantum Generator: Circuit-Based Design

  • Input: random noise vector \( z \)
  • Output: quantum state or measured bitstring
  • Parameterized quantum gates (e.g., RY, RX, entangling gates)

6. Quantum Discriminator Options

  • Classical: Neural network acting on measurement outcomes
  • Quantum: Circuit with learnable gates and projective measurements

7. Hybrid Classical-Quantum Architectures

  • Classical noise sampled → encoded into quantum state → generator circuit → measured → classical discriminator
  • Gradient flows through hybrid backpropagation
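
As a rough illustration of this pipeline (the names and shapes here are assumptions, not from the original), the expectation values measured from a quantum generator can be fed straight into a small classical discriminator:

import numpy as np

def discriminator(features, w, b):
    # Logistic-regression discriminator acting on measured expectation values
    return 1.0 / (1.0 + np.exp(-(np.dot(w, features) + b)))

# `features` would come from measuring the generator circuit
score = discriminator(np.array([0.2, -0.5]), np.array([0.7, -1.1]), 0.1)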

8. Objective Functions and Training

  • Cross-entropy or Wasserstein loss
  • Optimize using classical optimizers (Adam, COBYLA)
  • Training alternates between generator and discriminator

9. Training QGANs on NISQ Devices

  • Keep circuit depth minimal
  • Use measurement error mitigation
  • Batch training with fidelities estimated from a limited number of measurement shots

10. Gradient Estimation in QGANs

  • Use parameter-shift rule:
    \[
    \frac{\partial}{\partial \theta} \langle O \rangle = \frac{f(\theta + \pi/2) - f(\theta - \pi/2)}{2}
    \]
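
The rule is easy to verify numerically. A minimal PennyLane check (assuming a single RY rotation, for which the exact derivative of \( \langle Z \rangle = \cos\theta \) is \( -\sin\theta \)):

import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def f(theta):
    qml.RY(theta, wires=0)
    return qml.expval(qml.PauliZ(0))

theta = 0.3
grad = (f(theta + np.pi / 2) - f(theta - np.pi / 2)) / 2
print(grad, -np.sin(theta))  # the two values agree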

11. QGAN with Real-Valued Data

  • Encode real values into quantum rotations
  • Decode via expectation value mapping
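
A minimal sketch of this round trip (assuming angle encoding on one qubit, where \( \langle Z \rangle = \cos x \), so the value can be recovered on \( [0, \pi] \)):

import pennylane as qml
import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def encode_decode(x):
    qml.RY(x, wires=0)                 # encode the real value as a rotation
    return qml.expval(qml.PauliZ(0))   # <Z> = cos(x)

x = 0.7
recovered = np.arccos(encode_decode(x))  # invert the expectation-value mapping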

12. QGANs for Image and State Generation

  • Low-resolution image generation (e.g., 4×4)
  • Quantum state preparation for simulation

13. QGANs vs Classical GANs

| Feature | Classical GAN | Quantum GAN |
|---|---|---|
| Latent space | Real-valued vectors | Quantum amplitudes |
| Generator | Neural network | Quantum circuit |
| Discriminator | NN or SVM | Quantum or hybrid |
| Expressivity | High (deep nets) | Potentially exponential |

14. Implementation with PennyLane

import pennylane as qml

dev = qml.device("default.qubit", wires=2)  # two-qubit simulator backend

@qml.qnode(dev)
def generator_circuit(z, weights):
    qml.AngleEmbedding(z, wires=[0, 1])                  # encode noise vector z
    qml.StronglyEntanglingLayers(weights, wires=[0, 1])  # trainable layers
    return qml.expval(qml.PauliZ(0))

15. Implementation with Qiskit

  • Use Aer simulator or real backend
  • Construct generator and discriminator using QuantumCircuit
  • Optimize with qiskit.algorithms.optimizers
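
A bare-bones generator along these lines (a sketch with hypothetical gate choices, not a prescribed architecture):

from qiskit import QuantumCircuit
from qiskit.circuit import ParameterVector

theta = ParameterVector("theta", 4)
gen = QuantumCircuit(2)
gen.ry(theta[0], 0)
gen.ry(theta[1], 1)
gen.cx(0, 1)          # entangle the two qubits
gen.ry(theta[2], 0)
gen.ry(theta[3], 1)
gen.measure_all()     # sampled bitstrings are the generated data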

16. Quantum Wasserstein GANs (QWGAN)

  • Use Wasserstein loss:
    \[
    L = \mathbb{E}_{x \sim P_{\text{real}}}[D(x)] - \mathbb{E}_{z \sim P_z}[D(G(z))]
    \]
  • Lipschitz regularization required (e.g., gradient penalty)
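
In code, the critic side of this loss reduces to batch averages; a sketch (with `d_real`, `d_fake`, and `grad_norms` assumed to come from the discriminator and its input gradients):

import numpy as np

def critic_loss(d_real, d_fake):
    # Critic maximizes E[D(x)] - E[D(G(z))]; minimize the negation
    return -(d_real.mean() - d_fake.mean())

def gradient_penalty(grad_norms, lam=10.0):
    # WGAN-GP style regularizer: push ||grad D|| toward 1
    return lam * ((grad_norms - 1.0) ** 2).mean()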

17. Challenges in QGAN Training

  • Barren plateaus in generator circuit
  • Quantum noise and decoherence
  • Mode collapse and instability

18. Experimental Realizations and Research

  • IBM: Experimental QGAN in 2018
  • Numerous simulations in PennyLane and Cirq
  • Active area in QML research

19. Applications of QGANs

  • Synthetic data generation
  • Quantum chemistry state synthesis
  • Quantum data compression
  • Image enhancement in medical or quantum imaging

20. Conclusion

Quantum GANs bring the power of adversarial learning into the quantum domain. While limited by current hardware, they demonstrate how quantum circuits can learn and synthesize complex distributions, laying the foundation for quantum-native generative AI.


Quantum Boltzmann Machines: Quantum Models for Probabilistic Learning


Table of Contents

  1. Introduction
  2. Classical Boltzmann Machines Recap
  3. From Classical to Quantum Boltzmann Machines
  4. Structure of a Quantum Boltzmann Machine (QBM)
  5. Quantum Energy-Based Models
  6. Hamiltonian Representation in QBM
  7. Quantum States as Probability Distributions
  8. QBM vs Classical RBM and DBM
  9. Restricted Quantum Boltzmann Machines (RQBM)
  10. Training Quantum Boltzmann Machines
  11. Parameter Optimization with Contrastive Divergence
  12. Quantum Sampling and Measurement
  13. Gradient Estimation in QBMs
  14. QBM Implementation with D-Wave and Annealers
  15. QBM Implementation in Variational Quantum Circuits
  16. Applications of QBMs
  17. Challenges in Realizing QBMs
  18. Hybrid Quantum-Classical Strategies for QBM Training
  19. Software Frameworks Supporting QBM Models
  20. Conclusion

1. Introduction

Quantum Boltzmann Machines (QBMs) are quantum analogs of classical Boltzmann Machines, capable of representing and learning probability distributions using quantum mechanical systems. They are designed to harness quantum superposition and entanglement for generative modeling and statistical learning.

2. Classical Boltzmann Machines Recap

  • Energy-based probabilistic models
  • Represent distributions using energy functions (see the sketch after this list):
    \[
    P(v) = \frac{1}{Z} \sum_h e^{-E(v, h)}
    \]
  • Often trained using contrastive divergence
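
For a model small enough to enumerate, this distribution can be computed exactly. A brute-force sketch for a tiny bipartite (RBM-style) energy \( E(v, h) = -(v^\top W h + b^\top v + c^\top h) \), with illustrative weight names:

import numpy as np
from itertools import product

def p_visible(v, W, b, c):
    def unnorm(v):
        # sum over all hidden configurations: sum_h exp(-E(v, h))
        return sum(np.exp(v @ W @ h + b @ v + c @ h)
                   for h in map(np.array, product([0, 1], repeat=len(c))))
    Z = sum(unnorm(np.array(u)) for u in product([0, 1], repeat=len(b)))
    return unnorm(np.asarray(v)) / Z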

3. From Classical to Quantum Boltzmann Machines

  • Replace classical energy function with a quantum Hamiltonian
  • Use density matrices to represent probabilistic quantum states
  • Training involves manipulating Hamiltonian parameters

4. Structure of a Quantum Boltzmann Machine (QBM)

  • Nodes: visible and hidden qubits
  • Energy: represented by quantum Hamiltonian \( H \)
  • Quantum state: \( \rho = e^{-\beta H} / Z \)

5. Quantum Energy-Based Models

  • Define a Gibbs state:
    \[
    \rho = \frac{e^{-\beta H}}{Z}, \quad Z = \text{Tr}(e^{-\beta H})
    \]

  • The Hamiltonian encodes interactions between qubits

6. Hamiltonian Representation in QBM

Common form:
\[
H = \sum_i a_i Z_i + \sum_{i<j} b_{ij} Z_i Z_j + \sum_{i<j} c_{ij} X_i X_j
\]

  • Includes transverse fields (X), longitudinal fields (Z)
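
Both the Hamiltonian and the resulting Gibbs state are easy to build explicitly for two qubits. A sketch with made-up coefficients (the final state realizes the \( \rho = e^{-\beta H}/Z \) of Section 5):

import numpy as np
from scipy.linalg import expm

Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
I = np.eye(2)

# a_1 Z_1 + a_2 Z_2 + b_12 Z_1 Z_2 + c_12 X_1 X_2 with illustrative values
H = (0.5 * np.kron(Z, I) - 0.3 * np.kron(I, Z)
     + 0.8 * np.kron(Z, Z) + 0.4 * np.kron(X, X))

beta = 1.0
rho = expm(-beta * H)
rho /= np.trace(rho)              # Gibbs state; Tr(rho) = 1
probs = np.real(np.diag(rho))     # measurement probabilities in the Z basis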

7. Quantum States as Probability Distributions

  • Measurement collapses superposition into observable outcomes
  • Gibbs distribution approximated by repeated sampling

8. QBM vs Classical RBM and DBM

| Model | Structure | Training Strategy | Representation |
|---|---|---|---|
| RBM | Bipartite | Contrastive divergence | Classical |
| DBM | Deep layers | Greedy layer-wise | Classical |
| QBM | Quantum qubits | Quantum annealing / variational | Quantum density matrices |

9. Restricted Quantum Boltzmann Machines (RQBM)

  • Constrain connections to visible-hidden only
  • Easier to train and simulate
  • Analogous to RBMs

10. Training Quantum Boltzmann Machines

  • Use cost functions like KL divergence
  • Maximize likelihood or minimize free energy
  • Variational techniques to optimize parameters

11. Parameter Optimization with Contrastive Divergence

  • Approximate gradient by difference in data-driven and model-driven samples
  • Quantum version uses variational circuits to compute loss and gradients

12. Quantum Sampling and Measurement

  • Use quantum annealers (D-Wave) or variational sampling
  • Multiple measurements used to construct probability distributions

13. Gradient Estimation in QBMs

  • Use parameter-shift rule for gradients:
    \[
    \frac{\partial \langle H \rangle}{\partial \theta} = \frac{\langle H(\theta + \pi/2) \rangle - \langle H(\theta - \pi/2) \rangle}{2}
    \]

14. QBM Implementation with D-Wave and Annealers

  • D-Wave supports native Ising Hamiltonians
  • Limited control over entanglement and gate-level access

15. QBM Implementation in Variational Quantum Circuits

  • Use QAOA or VQE-style circuits to approximate Gibbs state
  • Optimize circuit parameters via classical feedback

16. Applications of QBMs

  • Generative modeling
  • Data compression
  • Quantum chemistry
  • Anomaly detection
  • Quantum-inspired autoencoders

17. Challenges in Realizing QBMs

  • State preparation and measurement fidelity
  • Gradient vanishing and barren plateaus
  • Complexity of simulating thermal states

18. Hybrid Quantum-Classical Strategies for QBM Training

  • Classical preprocessing (feature selection, PCA)
  • Quantum inference/sampling
  • Use hybrid loss functions combining classical and quantum observables

19. Software Frameworks Supporting QBM Models

  • PennyLane (variational circuits)
  • D-Wave Ocean SDK (annealing)
  • TensorFlow Quantum (experimental QBM-like workflows)

20. Conclusion

Quantum Boltzmann Machines provide a powerful framework for learning probability distributions on quantum hardware. Although still in early development, QBMs illustrate the potential of quantum energy-based learning models and pave the way toward quantum-native generative AI.


Quantum Principal Component Analysis (qPCA): Dimensionality Reduction with Quantum States


Table of Contents

  1. Introduction
  2. What Is Principal Component Analysis (PCA)?
  3. Motivation for Quantum PCA
  4. Quantum Representation of Covariance Matrices
  5. The qPCA Algorithm: Core Ideas
  6. Quantum Density Matrix as Covariance Proxy
  7. Step-by-Step Procedure of qPCA
  8. Using Quantum Phase Estimation in qPCA
  9. Extracting Principal Components from Quantum States
  10. Advantages of qPCA over Classical PCA
  11. Requirements and Assumptions
  12. Example Use Case: qPCA for Quantum State Compression
  13. Simulation and Benchmarking with qiskit.aqua.algorithms.qpca
  14. Implementation on Simulators vs Real Hardware
  15. qPCA for Anomaly Detection
  16. Comparison with Classical PCA Outputs
  17. Noise and Error Effects in qPCA
  18. Hybrid Strategies Combining PCA and qPCA
  19. Limitations and Current Research Challenges
  20. Conclusion

1. Introduction

Quantum Principal Component Analysis (qPCA) is a quantum algorithm inspired by classical PCA that extracts the dominant eigenvectors of a density matrix, offering exponential improvements in space complexity under certain conditions.

2. What Is Principal Component Analysis (PCA)?

  • PCA is a statistical method to reduce dimensionality by finding orthogonal directions (principal components) that capture the maximum variance in data.

3. Motivation for Quantum PCA

  • Classical PCA is computationally expensive on large datasets
  • qPCA uses quantum parallelism and phase estimation to extract eigenvalues and eigenvectors of a quantum density matrix

4. Quantum Representation of Covariance Matrices

  • In qPCA, the data covariance matrix is encoded as a quantum density matrix:
    \[
    \rho = \frac{1}{M} \sum_{i=1}^M |\psi_i\rangle \langle\psi_i|
    \]

5. The qPCA Algorithm: Core Ideas

  • Prepare multiple copies of \( \rho \)
  • Apply controlled unitary operations
  • Use Quantum Phase Estimation (QPE) to learn eigenvalues
  • Collapse system to principal components

6. Quantum Density Matrix as Covariance Proxy

  • A density matrix represents a mixture of quantum states and serves as the analog of the classical covariance matrix in qPCA

7. Step-by-Step Procedure of qPCA

  1. Input data → quantum state preparation
  2. Construct density matrix \( \rho \)
  3. Use QPE to extract eigenvalues \( \lambda_i \)
  4. Measure and obtain eigenvectors \( |\phi_i\rangle \)
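
The linear algebra qPCA performs can be emulated classically for small examples. The sketch below (with hypothetical states) builds \( \rho \) from pure states and eigendecomposes it, which is exactly the spectrum QPE would estimate on hardware:

import numpy as np

states = [np.array([1.0, 0.0]), np.array([1.0, 1.0]) / np.sqrt(2)]
rho = sum(np.outer(s, s.conj()) for s in states) / len(states)

eigvals, eigvecs = np.linalg.eigh(rho)
order = np.argsort(eigvals)[::-1]          # dominant components first
print(eigvals[order])                      # weights lambda_i
print(eigvecs[:, order])                   # principal components |phi_i>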

8. Using Quantum Phase Estimation in qPCA

  • QPE estimates the eigenvalues of a unitary operation
  • In qPCA, QPE is applied to the unitary \( e^{-i\rho t} \) generated by the density matrix itself, yielding estimates of the eigenvalues of \( \rho \)

9. Extracting Principal Components from Quantum States

  • Measurement outcomes provide information about the importance (weight) of each component
  • Output is a quantum state encoding dominant features

10. Advantages of qPCA over Classical PCA

  • Potential exponential speedup in data loading and eigendecomposition
  • Operates directly on quantum data or data embedded in quantum states

11. Requirements and Assumptions

  • Efficient state preparation
  • Multiple copies of \( \rho \)
  • QPE implementation

12. Example Use Case: qPCA for Quantum State Compression

  • Reduce state dimension by truncating low-eigenvalue components
  • Used in quantum machine learning and simulation

13. Simulation and Benchmarking with qiskit.aqua.algorithms.qpca

  • Qiskit provided early implementations in Aqua (now deprecated)
  • Simulate using small datasets encoded as quantum states

14. Implementation on Simulators vs Real Hardware

  • Simulators can emulate density matrices
  • Real hardware limited by noise and number of qubit copies

15. qPCA for Anomaly Detection

  • Model principal components of normal data
  • Flag states with poor projection as anomalies

16. Comparison with Classical PCA Outputs

  • Classical PCA outputs numerical eigenvectors
  • qPCA produces quantum states that must be measured to extract components

17. Noise and Error Effects in qPCA

  • QPE is sensitive to decoherence
  • Circuit depth and entanglement increase error susceptibility

18. Hybrid Strategies Combining PCA and qPCA

  • Use classical PCA to pre-filter features
  • Apply qPCA on compressed input states

19. Limitations and Current Research Challenges

  • Requires high-fidelity multi-qubit operations
  • Needs multiple quantum state copies
  • Interpretability challenges in quantum output

20. Conclusion

Quantum PCA offers a promising route to perform efficient dimensionality reduction on quantum and classical data. While currently limited by hardware constraints, qPCA demonstrates how quantum algorithms can fundamentally change data preprocessing for machine learning and statistical analysis.


Quantum Support Vector Machines: Leveraging Quantum Kernels for Pattern Classification


Table of Contents

  1. Introduction
  2. Classical Support Vector Machines (SVMs)
  3. Motivation for Quantum SVMs
  4. Quantum Kernels in SVMs
  5. Quantum Feature Mapping
  6. Quantum Kernel Matrix Estimation
  7. SVM Decision Function with Quantum Kernels
  8. Training Quantum SVM Models
  9. Fidelity-Based Kernels and Overlap Circuits
  10. Implementation with Qiskit
  11. Implementation with PennyLane
  12. Optimization and Regularization
  13. Noise and Circuit Depth Considerations
  14. Performance Benchmarks on Simulators
  15. Running Quantum SVMs on Real Hardware
  16. Visualization of Quantum Decision Boundaries
  17. Multiclass Extensions of Quantum SVMs
  18. Applications and Use Cases
  19. Limitations and Future Directions
  20. Conclusion

1. Introduction

Quantum Support Vector Machines (QSVMs) apply the principles of kernel-based learning using quantum computers. By embedding data into quantum Hilbert space, QSVMs can compute kernel matrices that capture complex, nonlinear relationships.

2. Classical Support Vector Machines (SVMs)

  • SVMs find the hyperplane that maximizes margin between data classes.
  • The kernel trick enables separation in high-dimensional spaces via linear, polynomial, or Gaussian (RBF) kernels

3. Motivation for Quantum SVMs

  • Quantum computers represent data in exponentially large Hilbert spaces.
  • Inner products in these spaces can represent rich similarity metrics.
  • QSVMs offer potential advantages in expressiveness and speed.

4. Quantum Kernels in SVMs

  • A quantum kernel \( k(x, x') \) is defined as the squared fidelity:
    \[
    k(x, x') = |\langle \phi(x) | \phi(x') \rangle|^2
    \]
  • \( \phi(x) \) is a quantum embedding of classical input \( x \)

5. Quantum Feature Mapping

  • Feature map \( U(x) \) is a parameterized quantum circuit that encodes data:
from qiskit.circuit.library import ZZFeatureMap
feature_map = ZZFeatureMap(feature_dimension=3, reps=2)

6. Quantum Kernel Matrix Estimation

  • Construct quantum circuits that compute pairwise fidelity between states.
  • Kernel matrix \( K_{ij} = k(x_i, x_j) \) is used by classical SVM solvers.
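
Because the quantum device only supplies the Gram matrix, the downstream solver is ordinary scikit-learn. A sketch (`K_train`, `K_test_train`, and `y_train` are assumed to be kernel matrices and labels already prepared):

from sklearn.svm import SVC

svc = SVC(kernel="precomputed")
svc.fit(K_train, y_train)             # K_train[i, j] = k(x_i, x_j)

# kernel values between test and training points are needed at prediction time
y_pred = svc.predict(K_test_train)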

7. SVM Decision Function with Quantum Kernels

  • Trained with a classical solver (e.g., scikit-learn's SVC with a precomputed kernel)
  • Prediction:
    \[
    f(x) = \sum_i \alpha_i y_i k(x_i, x) + b
    \]

8. Training Quantum SVM Models

  • Compute kernel matrix (quantum)
  • Solve dual SVM optimization (classical)
  • Predict test labels using quantum kernel evaluations

9. Fidelity-Based Kernels and Overlap Circuits

  • Overlap test estimates:
    \[
    |\langle \psi(x) | \psi(x') \rangle|^2
    \]
  • Can be implemented by composing one data-bound feature map with the inverse of the other (with `x1`, `x2` the two data points):

overlap_qc = feature_map.assign_parameters(x1).compose(
    feature_map.assign_parameters(x2).inverse())  # P(|0...0>) estimates the overlap

10. Implementation with Qiskit

from qiskit_machine_learning.kernels import QuantumKernel

# `backend` is a simulator or device (e.g., wrapped in a QuantumInstance)
qkernel = QuantumKernel(feature_map=feature_map, quantum_instance=backend)
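
The kernel object is then evaluated to produce Gram matrices for a classical solver; in releases that ship QuantumKernel, this looks roughly like:

K_train = qkernel.evaluate(x_vec=X_train)                 # train-vs-train Gram matrix
K_test = qkernel.evaluate(x_vec=X_test, y_vec=X_train)    # test-vs-train kernel values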

11. Implementation with PennyLane

In PennyLane, a fidelity kernel is typically written as a QNode that embeds one input, applies the adjoint embedding of the other, and reads out the probability of the all-zeros state; helpers such as qml.kernels.square_kernel_matrix can then assemble Gram matrices from it. A minimal sketch follows (the embedding choice and wire count are illustrative):
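
import pennylane as qml
import numpy as np

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def kernel(x1, x2):
    qml.AngleEmbedding(x1, wires=[0, 1])
    qml.adjoint(qml.AngleEmbedding)(x2, wires=[0, 1])
    return qml.probs(wires=[0, 1])

x1, x2 = np.array([0.1, 0.4]), np.array([0.2, 0.3])
k_value = kernel(x1, x2)[0]   # P(|00>) = |<phi(x1)|phi(x2)>|^2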

12. Optimization and Regularization

  • Kernel SVM regularized with hyperparameter \( C \)
  • Prevents overfitting in high-dimensional space

13. Noise and Circuit Depth Considerations

  • Shorter feature maps preferred on NISQ devices
  • Simulators allow deeper circuits for benchmarking

14. Performance Benchmarks on Simulators

  • Use Iris, Wine, or Breast Cancer datasets
  • Compare accuracy with classical SVMs
  • Analyze kernel matrix separability

15. Running Quantum SVMs on Real Hardware

  • Use IBM QPU or AWS Braket
  • Shot noise and queue delays must be accounted for
  • Use error mitigation where available

16. Visualization of Quantum Decision Boundaries

  • Project 2D dataset using PCA
  • Visualize kernel-induced boundary with contour plots

17. Multiclass Extensions of Quantum SVMs

  • One-vs-rest strategy using binary QSVMs
  • Ensemble quantum classifiers

18. Applications and Use Cases

  • Fraud detection
  • Biometric classification
  • Quantum-enhanced recommendation systems

19. Limitations and Future Directions

  • Kernel estimation scales quadratically with data size
  • Limited by qubit count and fidelity
  • Promising for small- to mid-sized structured datasets

20. Conclusion

Quantum Support Vector Machines offer a practical framework for exploring quantum advantage in classification tasks. By leveraging quantum kernels, QSVMs extend the power of classical SVMs into the quantum domain, paving the way for future breakthroughs in quantum-enhanced learning.
