BBM 444 Programming Assignment 4 Questions | HU

Published: 19 May, 2025
Category: Assignment · Subject: Programming
University: Hacettepe University · Module Title: BBM 444 Programming

Overview 

Digital cameras embed a forward image-signal-processing (ISP) chain that maps sensor RAW → white-balanced, demosaiced, tone-mapped sRGB. These operations compress the data, but they obliterate the linear, noise-interpretable RAW signal that modern low-level vision algorithms need for denoising, HDR, super-resolution, and more. This assignment walks you through building and fine-tuning an inverse-ISP U-Net that reconstructs Bayer-pattern RAW from sRGB, evaluating it quantitatively, and testing it on photos you shoot yourself. Deliverables:

  • Data pipeline: RAW ↔ sRGB conversion, patch extraction, splits.
  • U-Net implementation.
  • Training (two stages).
  • Quantitative report (PSNR, SSIM) & qualitative visualisations.
  • Discussion of failure cases and improvement ideas.
  • Bonus: create a rendering model.

Background & Motivation 

Unlike processed RGB, a Bayer RAW frame preserves the sensor's linear radiometric mapping, giving:

(a) physically meaningful noise statistics, 
(b) full 12-16-bit dynamic range, 
(c) explicit colour-filter (R/G/B) layout. 

These properties facilitate RAW-domain denoising, HDR, and super-resolution, but public RAW datasets are scarce. Inverse-ISP research attempts to convert JPEG back to RAW [1,2,3,4,5]. Remaining challenges include scene-dependent ISP settings and limited network capacity. In this homework, you will implement a practical pipeline that produces a 4-plane RAW output by fine-tuning a U-Net, and analyse its limits on a dataset captured with the Samsung S20 FE camera.

Tasks & Points

This section describes every deliverable, the recommended procedure, and the points assigned to each part. 

Input Preparation (10 pts) — Create the linear-sRGB input with dcraw 

The network will not consume the JPEG provided by the camera. Instead, you must generate a linear RGB tensor from the RAW file itself. Use the five-step recipe below: 

Develop the RAW into sRGB TIFF format. 

Run the command below for each image to turn the raw file into a processed sRGB image:

dcraw -w -q 3 -o 1 -6 -g 1 1 -T IMG.dng

Here -w applies the camera's white balance, -q 3 uses the high-quality AAHD demosaic, -o 1 converts to the standard sRGB matrix, -g 1 1 keeps the tone curve linear, and -6 -T writes a 16-bit TIFF.
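To batch-develop a whole folder, the recipe above can be scripted. The sketch below assumes dcraw is on your PATH; the helper names (build_dcraw_cmd, develop_all) are ours, not part of the handout.

```python
import subprocess
from pathlib import Path

def build_dcraw_cmd(dng_path):
    # Assemble the dcraw invocation from the recipe above.
    return [
        "dcraw",
        "-w",            # camera white balance
        "-q", "3",       # high-quality AAHD demosaic
        "-o", "1",       # sRGB output colour space
        "-6",            # 16-bit output
        "-g", "1", "1",  # linear tone curve
        "-T",            # write TIFF instead of PPM
        str(dng_path),
    ]

def develop_all(raw_dir):
    # Run dcraw on every .dng in raw_dir (assumes dcraw is installed).
    for dng in sorted(Path(raw_dir).glob("*.dng")):
        subprocess.run(build_dcraw_cmd(dng), check=True)
```

dcraw writes the TIFF next to the input file, so no output path is needed.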


Loading 

Load all the TIFFs as float64 tensors.
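A minimal loading sketch: the normalisation below divides by 65535 to map 16-bit values into [0, 1]. Reading via tifffile is our assumption, not mandated by the handout; any 16-bit-capable TIFF reader works.

```python
import numpy as np

def tiff_to_float64(img_u16):
    # Scale a 16-bit image array to float64 in [0, 1].
    return img_u16.astype(np.float64) / 65535.0

# Reading the file itself could use e.g. tifffile (an assumption):
#   import tifffile
#   img = tiff_to_float64(tifffile.imread("IMG.tiff"))

demo = np.array([[0, 32768, 65535]], dtype=np.uint16)
demo_f = tiff_to_float64(demo)  # values near 0, 0.5, 1
```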

Downsampling 

Downsample the images with strides Nx and Ny chosen so that I[::Nx, ::Ny] is at least 256×256. Then:

(a) Crop any extra rows/cols (e.g. centre or corner-crop) so the result is exactly 256×256. 
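The downsample-then-crop step might look like this (the helper name downsample_and_crop is ours; it assumes the image is at least 256 pixels on each side):

```python
import numpy as np

def downsample_and_crop(img, size=256):
    h, w = img.shape[:2]
    # Largest integer strides that still leave at least `size` rows/cols.
    ny = max(1, h // size)
    nx = max(1, w // size)
    small = img[::ny, ::nx]
    # Centre-crop the (possibly slightly larger) result to exactly size×size.
    top = (small.shape[0] - size) // 2
    left = (small.shape[1] - size) // 2
    return small[top:top + size, left:left + size]
```

For example, a 600×700×3 image is strided by (2, 2) to 300×350×3, then centre-cropped to 256×256×3.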

Noise augmentation

Add zero-mean Gaussian noise:

I_noisy = clip(I + N(0, σ²), 0, 1),  σ ∈ [0.01, 0.02].
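A sketch of this augmentation, assuming a per-image σ drawn uniformly from the given range:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_gaussian_noise(img, sigma_range=(0.01, 0.02)):
    # Draw sigma uniformly, add zero-mean Gaussian noise, clip back to [0, 1].
    sigma = rng.uniform(*sigma_range)
    noisy = img + rng.normal(0.0, sigma, size=img.shape)
    return np.clip(noisy, 0.0, 1.0)
```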

Best model

Select the best model with the lowest val-loss (or highest val-PSNR).

Metrics

Report PSNR, SSIM, and noise-ratio on the test set.

Figure 2: Modified U-Net architecture for sRGB→4-plane RAW reconstruction. Here, F denotes the number of feature channels at each layer. In our setup, F starts at 8 (or could be larger) and doubles after each downsampling step.

Pseudo-code for Stage I 

best_val_loss = float("inf")
for epoch in range(50):
    for x, y in train_loader_stage1:
        y_hat = model(x)
        loss = 0.8 * l1_loss(y_hat, y) + 0.2 * (1 - ssim(y_hat, y))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    val_loss = evaluate(model, val_loader_stage1)
    if val_loss < best_val_loss:
        best_val_loss = val_loss  # checkpoint the model here

Stage II: sRGB→Single-Channel Full-Res Bayer 

Now we repurpose the same U-Net but predict a single full-resolution Bayer mosaic: 

Architecture tweak

Extend the decoder with one additional upsampling block to double the resolution (from H/2 → H). Move the final 1×1 convolution to this new block to produce a 1×H×W output. This additional block should also include a skip connection from the encoder's first layer.
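To sanity-check the shape arithmetic of the extra block, here is a NumPy sketch: nearest-neighbour upsampling and a 1×1 convolution written as a per-pixel channel mix. In the real network these are learnable PyTorch layers; the function names are ours.

```python
import numpy as np

def upsample_nn(x):
    # Nearest-neighbour 2x upsample of a C×H×W tensor.
    return x.repeat(2, axis=1).repeat(2, axis=2)

def extra_block(decoder_feat, skip_feat, w_1x1):
    # decoder_feat: C×(H/2)×(W/2); skip_feat: C_s×H×W (encoder's first layer)
    up = upsample_nn(decoder_feat)
    cat = np.concatenate([up, skip_feat], axis=0)  # (C + C_s)×H×W
    # A 1x1 convolution is just a per-pixel linear map over channels.
    return np.einsum("oc,chw->ohw", w_1x1, cat)
```

With 8 decoder channels at 128×128, a 4-channel 256×256 skip, and a (1, 12) weight, the output is 1×256×256, i.e. the single-channel full-resolution mosaic.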

Labels

Each ground truth is the raw Bayer mosaic, stored as a 1×H×W tensor in [0,1].
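Converting between the 1×H×W mosaic (Stage II labels) and the 4×(H/2)×(W/2) planes (Stage I labels) can be sketched as below. The RGGB plane order is an assumption here; match it to the actual CFA layout of your sensor.

```python
import numpy as np

def pack_rggb(mosaic):
    # H×W Bayer mosaic (assumed RGGB) -> 4×(H/2)×(W/2) planes: R, G1, G2, B.
    return np.stack([
        mosaic[0::2, 0::2],  # R
        mosaic[0::2, 1::2],  # G1
        mosaic[1::2, 0::2],  # G2
        mosaic[1::2, 1::2],  # B
    ])

def unpack_rggb(planes):
    # Inverse: 4×h×w planes -> 2h×2w mosaic.
    _, h, w = planes.shape
    mosaic = np.empty((2 * h, 2 * w), dtype=planes.dtype)
    mosaic[0::2, 0::2] = planes[0]
    mosaic[0::2, 1::2] = planes[1]
    mosaic[1::2, 0::2] = planes[2]
    mosaic[1::2, 1::2] = planes[3]
    return mosaic
```

pack_rggb followed by unpack_rggb is the identity, which is a useful unit test for your own data pipeline.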

Training recipe

Identical to Stage I (same loss, optimiser, LR schedule, batch size, augmentations, stopping criterion).

Evaluation

Report PSNR and SSIM over full-res mosaics; show a few visual comparisons.

Pseudo-code for Stage II 

best_val_loss = float("inf")
for epoch in range(50):
    for x, mosaic in train_loader_stage2:
        pred = model_stage2(x)
        loss = 0.8 * l1_loss(pred, mosaic) + 0.2 * (1 - ssim(pred, mosaic))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    val_loss = evaluate(model_stage2, val_loader_stage2)
    if val_loss < best_val_loss:
        best_val_loss = val_loss  # checkpoint the model here

By separating the two stages, you get both a packed 4-plane RAW converter and a full-resolution Bayer renderer, each trained end-to-end with identical hyperparameters for fair comparison. 

Quantitative Results (15 pts)

Evaluate the fine-tuned model on the test split and report the metrics defined below. All tensors must be normalised to the range [0,1] before you compute any score.

Peak Signal-to-Noise Ratio (PSNR)

PSNR(R, R̂) = 10 · log10( L² / MSE(R, R̂) ),  L = 1

where MSE is the mean-squared error and L = 1 because our images are scaled to [0,1]. You can call:

from torchmetrics.functional import peak_signal_noise_ratio as psnr
psnr_val = psnr(pred, target, data_range=1.0)
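For intuition, the formula can also be implemented directly in NumPy (the function name psnr_db is ours):

```python
import numpy as np

def psnr_db(pred, target, data_range=1.0):
    # PSNR = 10 * log10(L^2 / MSE), with L = data_range.
    mse = np.mean((pred - target) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)

# Example: a constant error of 0.1 gives MSE = 0.01, i.e. ~20 dB.
target = np.zeros((4, 4))
pred = np.full((4, 4), 0.1)
score = psnr_db(pred, target)
```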

Structural Similarity Index (SSIM)

SSIM measures perceived quality by comparing local luminance, contrast, and structure. It ranges from −1 to 1 (higher is better). Use:

from torchmetrics.functional import structural_similarity_index_measure as ssim
ssim_val = ssim(pred, target, data_range=1.0)

Report the average PSNR (dB) and SSIM over all test patches in your notebook. Save the loss functions for

Training recipe:

Optimiser: Adam, lr = 10⁻⁴, cosine-annealed over 50 epochs.
Batch size 8; augment with flips/rotations/exposure jitter.
50 epochs minimum.
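The cosine-annealing schedule in the recipe follows the standard formula (PyTorch's torch.optim.lr_scheduler.CosineAnnealingLR implements the same curve); a plain-Python sketch:

```python
import math

def cosine_lr(epoch, total_epochs=50, lr_max=1e-4, lr_min=0.0):
    # Cosine annealing: lr_max at epoch 0, decaying smoothly to lr_min.
    return lr_min + 0.5 * (lr_max - lr_min) * (
        1 + math.cos(math.pi * epoch / total_epochs)
    )

# epoch 0 -> 1e-4, epoch 25 -> 5e-5, epoch 50 -> 0
```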

Evaluation: Report PSNR, SSIM, and heat-map [6] on the test set.

Visualisation: Show a few rendered sRGB crops vs. ground-truth sRGB side by side. You now have a full end-to-end forward ISP network: RAW→sRGB! 

Hints & Troubleshooting 

(1) Learning-rate sanity: divergence (loss shoots to 10⁸ or NaN) usually means the LR is too high for your batch size. Try LR = 5×10⁻⁵ or freeze the encoder for the first 5 epochs.

(2) Out-of-memory (OOM)on GPU: 

  • Halve batch_size (memory ∝ batch size); training takes longer but fits.
  • Enable mixed precision (torch.cuda.amp) to cut activation memory roughly in half.

(3) Trade-off reminder: smaller batches ⇒ less VRAM, but more iterations per epoch, so wall-clock training is slower.

(4) Limited Hardware?

 If you do not have access to a local GPU, you are encouraged to run your notebook on Google Colab ("T4 GPU" runtime is free) or another cloud service. We recommend using Colab for this assignment, as all provided notebooks have been tested and verified to work on Colab environments. You should periodically store your model checkpoints on Google Drive to avoid losing progress.

(5) Colab Usage Tip 

Since Google Colab imposes a 6-hour GPU usage limit, it is recommended to initially develop and debug your code on CPU mode to conserve GPU time. Switch to GPU only for actual model training or computationally heavy parts. This will maximise your available resources. 


(6) Training Tip

You are encouraged to experiment with different hyperparameters or modify the network's learnable parameters if you believe it may improve performance. However, be sure to document any changes you make in the Discussion section. 

(7) Qualitative Visualisation Hint

When visualising your model outputs on the selected test samples, apply the ISP pipeline to both the model's output and the corresponding ground-truth labels. Comparing the rendered outputs to the rendered labels (instead of the original sRGB inputs) ensures a fair and consistent evaluation.
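As a minimal stand-in for the rendering step, one could clip and gamma-encode both tensors identically. This is a deliberately simplified sketch (a real ISP also white-balances and colour-corrects), and render_srgb is a hypothetical helper name:

```python
import numpy as np

def render_srgb(linear_rgb, gamma=2.2):
    # Minimal render: clip to [0, 1], then apply a display gamma curve.
    # Apply this same function to BOTH model output and ground-truth label
    # so the comparison stays consistent.
    return np.clip(linear_rgb, 0.0, 1.0) ** (1.0 / gamma)
```

Gamma-encoding brightens mid-tones (e.g. 0.25 maps to roughly 0.53), which is why comparing un-rendered linear outputs against rendered sRGB would be misleading.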

Rubric (100 pts + 25 bonus) 

  • Input & Output pipeline- 20 pts
  • Model & Training- 40 pts
  • Quantitative results- 15 pts
  • Visualisation- 10 pts
  • Discussion- 15 pts
  • Bonus (task §4.7) - +25 pts
