Transmission Line Fault Detection using Deep Learning


Description

   This is a structured flow and explanation of the code: a solution pipeline for time-series classification, applied to power-grid fault detection using a neural network that combines **LSTM**, **Attention**, and **Capsule** layers.

📌 Overview

The pipeline consists of:

  1. Data Preprocessing
  2. Feature Engineering
  3. Model Design (LSTM + Attention + Capsule)
  4. Cross-Validation Training
  5. Threshold Optimization
  6. Test Prediction & Submission

📁 1. Data Preprocessing

df_train = pd.read_csv('../input/metadata_train.csv')
  • Reads metadata (labels and signal mapping) and sets a multi-index (id_measurement, phase).

Min-max Transformation

def min_max_transf(ts, min_data, max_data, range_needed=(-1,1)):
  • Scales raw signal values to a specified range (usually [-1, 1]) using linear scaling.
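A minimal NumPy sketch of this scaling step, assuming the raw signal bounds are passed in as min_data and max_data (the negative-minimum branch mirrors how signed raw signals are commonly shifted before scaling):

```python
import numpy as np

def min_max_transf(ts, min_data, max_data, range_needed=(-1, 1)):
    # Rescale ts linearly from [min_data, max_data] into range_needed.
    if min_data < 0:
        # Shift signed signals to start at 0 before normalizing.
        ts_std = (ts + abs(min_data)) / (max_data + abs(min_data))
    else:
        ts_std = (ts - min_data) / (max_data - min_data)
    # Stretch [0, 1] into the requested range, e.g. [-1, 1].
    return ts_std * (range_needed[1] - range_needed[0]) + range_needed[0]
```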

🛠️ 2. Feature Engineering

def transform_ts(ts, n_dim=160, min_max=(-1,1)):
  • Splits the time series into buckets.
  • For each bucket, it computes:

    • Mean, Std, Std bounds, Percentiles, Relative Percentiles
  • Output: Feature vector of shape [n_dim, 17] for each signal.
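A NumPy sketch of this bucketing step. The exact list of 17 per-bucket statistics below (mean, std, std bounds, max range, six percentiles, and their mean-relative versions) is an illustrative assumption chosen to match the [n_dim, 17] shape stated above:

```python
import numpy as np

def transform_ts(ts, n_dim=160, min_max=(-1, 1)):
    # Split the (already rescaled) series into n_dim equal buckets and
    # summarize each bucket with 17 statistics.
    ts = np.asarray(ts, dtype=np.float64)
    bucket_size = len(ts) // n_dim
    features = []
    for i in range(0, n_dim * bucket_size, bucket_size):
        chunk = ts[i:i + bucket_size]
        mean, std = chunk.mean(), chunk.std()
        std_top, std_bot = mean + std, mean - std          # std bounds
        pcts = np.percentile(chunk, [0, 1, 25, 75, 99, 100])
        max_range = pcts[-1] - pcts[0]
        rel_pcts = pcts - mean                             # relative percentiles
        features.append(np.concatenate(
            [[mean, std, std_top, std_bot, max_range], pcts, rel_pcts]))
    return np.asarray(features)  # shape (n_dim, 17) with these choices
```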
def prep_data(start, end):
  • Loads signal data from Parquet files.
  • Transforms all 3 phases of each measurement into features.
  • Concatenates features across phases.
load_all()
  • Loads the entire training dataset in two parts to manage memory.
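The phase-assembly step can be sketched as follows. `assemble_measurement` is a hypothetical helper name used here for illustration; in the actual code this concatenation happens inside prep_data after each phase signal is read from Parquet and featurized:

```python
import numpy as np

def assemble_measurement(phase_features):
    # phase_features: list of three (n_dim, n_feat) arrays, one per phase,
    # each produced by transform_ts. Concatenating along the feature axis
    # yields one training sample of shape (n_dim, 3 * n_feat).
    return np.concatenate(phase_features, axis=1)
```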

🧮 3. Metric Definition

def matthews_correlation(...)
  • Implements Matthews Correlation Coefficient (MCC) as a metric.
  • MCC is well suited to imbalanced binary classification, since it accounts for all four confusion-matrix cells.
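A NumPy sketch of the metric on hard 0/1 labels; the kernel itself implements the same formula with Keras backend ops so it can run on tensors during training:

```python
import numpy as np

def matthews_correlation(y_true, y_pred, eps=1e-7):
    # Matthews Correlation Coefficient from the four confusion-matrix cells.
    y_true = np.asarray(y_true, dtype=np.float64)
    y_pred = np.asarray(y_pred, dtype=np.float64)
    tp = np.sum(y_true * y_pred)
    tn = np.sum((1 - y_true) * (1 - y_pred))
    fp = np.sum((1 - y_true) * y_pred)
    fn = np.sum(y_true * (1 - y_pred))
    numerator = tp * tn - fp * fn
    denominator = np.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return numerator / (denominator + eps)  # eps guards the zero-denominator case
```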

🧠 4. Custom Layers

🧲 Attention Layer

class Attention(Layer):
  • Learns attention weights to focus on important time steps.
  • Outputs a context vector by weighted sum over time.
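The core of such an attention layer, sketched in NumPy for a single sequence; `w` and `b` stand in for the layer's learned parameters:

```python
import numpy as np

def attention_pool(h, w, b):
    # h: (timesteps, features) hidden states, e.g. BiLSTM output
    # w: (features,) learned projection, b: scalar bias
    e = np.tanh(h @ w + b)               # one alignment score per time step
    a = np.exp(e - e.max())              # numerically stable softmax
    a = a / a.sum()                      # attention weights, sum to 1
    return (h * a[:, None]).sum(axis=0)  # weighted sum -> context vector
```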

🎯 Capsule Layer

class Capsule(Layer):
  • Captures spatial relationships via dynamic routing.
  • Applies squashing activation to constrain output vector length.
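The squashing nonlinearity can be sketched in NumPy; it preserves a vector's direction while mapping its length into [0, 1):

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    # Capsule squashing: short vectors shrink toward 0, long vectors
    # approach unit length, direction is unchanged.
    sq_norm = np.sum(v ** 2, axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm) / np.sqrt(sq_norm + eps)
    return scale * v
```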

🏗️ 5. Model Architecture

def model_lstm(input_shape):

Model Structure:

  • Input: Shape (160, 51) — 3 phases × 17 features per phase
  • Layers:

    • 2 BiLSTM (CuDNNLSTM) layers
    • Attention Layer
    • Capsule Layer (on BiLSTM output)
    • CNN + Attention + Capsule (parallel path)
    • Concatenate → Dense → Output (sigmoid)
  • Compiled with binary crossentropy loss and MCC metric

🔁 6. Cross-Validation

splits = list(StratifiedKFold(n_splits=5)...)
  • Applies Stratified 5-fold Cross-Validation.
  • For each fold:

    • Clears Keras session.
    • Trains the model.
    • Saves best weights using ModelCheckpoint.
    • Stores validation predictions and scores.
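The fold mechanics can be sketched as follows, with the Keras model replaced by a trivial stub so the loop structure stands on its own. Array shapes follow the (160, 51) feature matrices built earlier; the data here is synthetic:

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

X = np.zeros((50, 160, 51), dtype=np.float32)
y = np.array([1] * 10 + [0] * 40)  # imbalanced labels, like the fault data

# Stratification keeps the positive rate equal across folds.
splits = list(StratifiedKFold(n_splits=5, shuffle=True,
                              random_state=2019).split(X, y))
preds_val = np.zeros(len(y))
for fold, (trn_idx, val_idx) in enumerate(splits):
    # Real pipeline: K.clear_session(); model = model_lstm(X.shape[1:]);
    # model.fit(..., callbacks=[ModelCheckpoint(...)]); load best weights;
    # predict on the held-out fold.
    preds_val[val_idx] = y[trn_idx].mean()  # stub "model": training base rate
```

The collected out-of-fold predictions (preds_val) are what the threshold search in the next step operates on.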

🎯 7. Threshold Optimization

def threshold_search(y_true, y_proba):
  • Searches for best threshold for MCC.
  • MCC is computed for thresholds in [0.00, 0.99] using matthews_correlation.
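A sketch of this search, with a small NumPy MCC helper standing in for the kernel's metric:

```python
import numpy as np

def mcc(y_true, y_pred, eps=1e-7):
    # Matthews correlation on hard 0/1 labels.
    tp = np.sum(y_true * y_pred)
    tn = np.sum((1 - y_true) * (1 - y_pred))
    fp = np.sum((1 - y_true) * y_pred)
    fn = np.sum(y_true * (1 - y_pred))
    den = np.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / (den + eps)

def threshold_search(y_true, y_proba):
    # Scan thresholds 0.00 .. 0.99 and keep the MCC-maximizing one.
    y_true = np.asarray(y_true, dtype=np.float64)
    y_proba = np.asarray(y_proba, dtype=np.float64)
    best = {'threshold': 0.0, 'matthews_correlation': -1.0}
    for threshold in [i * 0.01 for i in range(100)]:
        score = mcc(y_true, (y_proba > threshold).astype(np.float64))
        if score > best['matthews_correlation']:
            best = {'threshold': threshold, 'matthews_correlation': score}
    return best
```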

🧪 8. Test Data Prediction

meta_test = pd.read_csv('../input/metadata_test.csv')
  • Test metadata is read and transformed with the same feature pipeline as the training data.
for start, end in start_end:
  • Loads test data in chunks for memory efficiency.
  • Each test instance is represented by 3 phase signals → features → concatenated.
preds_test = []
  • Uses 5 trained models (from cross-validation).
  • Averages predictions and applies optimal threshold.
  • Maps the same scalar prediction to all 3 signals in a measurement.
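The averaging and label-broadcast step might look like this on toy numbers; best_threshold stands in for the value found by threshold_search:

```python
import numpy as np

# 5 fold models each score 2 test measurements; probabilities are averaged,
# the tuned threshold is applied, and each measurement's label is repeated
# for its 3 phase signals.
fold_preds = np.array([[0.90, 0.10],
                       [0.80, 0.30],
                       [0.70, 0.20],
                       [0.95, 0.15],
                       [0.85, 0.25]])   # (n_folds, n_measurements)
best_threshold = 0.5                    # assumed threshold_search output
mean_pred = fold_preds.mean(axis=0)     # average across the 5 models
labels = (mean_pred > best_threshold).astype(int)   # one label per measurement
preds_test = np.repeat(labels, 3)       # one label per phase signal
```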

📤 9. Submission

submission['target'] = preds_test
submission.to_csv('submission.csv', index=False)
  • Saves the final predicted labels into submission format.

📌 Summary Diagram of Model

                      Input (160 x 51)
                             │
              ┌──────────────┴──────────────┐
              │                             │
      [BiLSTM → BiLSTM]                 [1D Conv]
              │                             │
      ┌───────┴───────┐             ┌───────┴───────┐
      │               │             │               │
 [Attention]      [Capsule]    [Attention]      [Capsule]
 [+ Dropout]     [+ Flatten]                   [+ Flatten]
      │               │             │               │
      └───────┬───────┘             └───────┬───────┘
              │                             │
              └────────[Concatenate]────────┘
                             │
                    [Dense + Dropout]
                             │
                          Sigmoid

✅ Key Points

  • Efficient time-series transformation using statistical features.
  • Hybrid deep model combining sequence modeling (LSTM), spatial focus (Attention), and hierarchical encoding (Capsules).
  • MCC used as evaluation metric due to dataset imbalance.
  • Cross-validation ensures robustness and threshold tuning optimizes final performance.

abhishek gupta

ScholarsColab.com is an innovative, first-of-its-kind platform created by Vidhilekha Soft Solutions Pvt Ltd, an innovative research startup recognized by the Department for Promotion of Industry and Internal Trade, Ministry of Commerce and Industry, Government of India.

12 reviews for Transmission Line Fault Detection using Deep Learning

  1. amal.khaleel

    good

  2. bk.chaitanya (verified owner)

    Good

  3. majid.ali (verified owner)

    thank you

  4. mario.manana (verified owner)

    Good seminar. Thank you very much for the initiative

  5. rushikesh.jadhav (verified owner)

    Good

  6. paulo jose da costa.branco (verified owner)

    none significant

  7. harsh wardhan.pandey (verified owner)

    Great Initiative
