About our project » History » Version 21

Anderson PHILLIP, 11/08/2025 01:30 AM

[[Wiki|← Back to Start Page]]

h1. Project Details

---

h2. Overview

The goal of this project is to create a unified display by merging two projector images into one.  

To achieve this, we employed techniques such as *gamma correction*, *alpha blending*, and *intensity adjustments* to refine the final output.

The project team is organized into several roles:

* Project Manager
* Team Leader
* Doxygen Documentation Team
* Wiki Team
* Coding Team
* UML Design Team

---

h2. Software and Tools

|_. Tool |_. Description |
| *Redmine* | Used to manage project tasks, track progress, and organize team activities. |
| *Python* | Used for image-processing operations such as blending, gamma correction, and intensity adjustment. |
| → *OpenCV* | A Python library for computer vision and image-processing operations. |
| → *numpy* | A Python library providing the numerical-computation functions behind the image processing. |
| → *json* | A built-in Python library for reading and writing the JSON configuration files used in video and image processing. |
| → *tkinter* | A built-in Python library for creating the GUI and supporting interactive interface operations. |
| *Doxygen* | Used to generate detailed code documentation automatically. |
| *Astah* | Used to design and visualize UML diagrams representing the system's structure and workflow. |


---

h2. Technology

We plan to use a single flat-screen display illuminated by two projectors connected to separate computers, allowing both images to be projected simultaneously and combined into one seamless image.

To accomplish this, we will apply:

* Gamma correction
* Alpha blending
* Intensity modification

p=. !{width:800px}car.png!

---

h2. Alpha Blending Method

Alpha blending merges two images using a specific *alpha value* that determines transparency levels.  

This process is crucial in rendering and game development for combining visual elements smoothly and reducing jagged edges.

Our program uses:

* *Tkinter* – for the graphical interface
* *OpenCV* – for image manipulation
* *PIL* – for image rendering

Users can load configuration files, apply gamma correction and alpha blending, and view/save the processed images.
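The blend itself reduces to a weighted sum of the two source images. Below is a minimal NumPy sketch of that operation, omitting the Tkinter/OpenCV plumbing; `alpha_blend` is an illustrative helper, not the program's actual function:

```python
import numpy as np

def alpha_blend(img_a, img_b, alpha):
    """Weighted sum of two same-sized images: alpha*A + (1 - alpha)*B."""
    a = img_a.astype(np.float32)
    b = img_b.astype(np.float32)
    blended = alpha * a + (1.0 - alpha) * b
    return np.clip(blended, 0, 255).astype(np.uint8)

# Two solid test "images": one white, one black
white = np.full((4, 4), 255, dtype=np.uint8)
black = np.zeros((4, 4), dtype=np.uint8)

half = alpha_blend(white, black, 0.5)
print(half[0, 0])  # 127 (mid gray)
```

OpenCV's `cv2.addWeighted()` performs the same weighted sum in a single call, which is the natural choice once images are already loaded as OpenCV arrays.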

p=. !blendingexmaple.jpg!

h3. Interactive Alpha Blending Simulator

To visualize how *alpha blending* and *gamma correction* affect overlapping regions, we developed an interactive simulator.

This tool lets users modify the blending parameters (Alpha A/B, Gamma, and Color) and observe the blending behavior between two projected regions in real time.

*Access the simulator here:*

"Alpha Blending Simulator":https://gilwebsite.pythonanywhere.com

_This simulator was developed by a team member as a supporting tool to demonstrate the blending principle used in our project._

External reference:

https://takinginitiative.net/2010/04/09/directx-10-tutorial-6-transparency-and-alpha-blending/

---

h2. Gamma Correction Method

This project uses a technique known as *gamma correction*, which modifies an image's brightness by adjusting the luminance of each pixel in a *non-linear* way. This aligns image brightness with *human visual perception*: the human eye is more sensitive to variations in darker tones than in brighter ones, a biological adaptation that allows us to see effectively across a wide range of lighting conditions. Gamma correction applies a *power-law transformation* to pixel values, as represented by the following equation:

p=. !{width:800px}equation.png!

When the gamma value (γ) is greater than 1, the image appears darker; when it is less than 1, the image appears brighter.

Note: since pixel intensities range from 0 to 255, the image is first normalized (divided by 255), the power function is then applied with the specified gamma value, and the result is scaled back to the 0–255 range.

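That normalize → power → rescale sequence can be sketched in a few lines of NumPy (an illustrative sketch, not the project's exact code):

```python
import numpy as np

def gamma_correct(img, gamma):
    """Apply power-law gamma correction to an 8-bit image."""
    normalized = img.astype(np.float32) / 255.0   # step 1: scale to [0, 1]
    corrected = np.power(normalized, gamma)       # step 2: power-law transform
    return np.clip(corrected * 255.0, 0, 255).astype(np.uint8)  # step 3: rescale

gray = np.full((2, 2), 128, dtype=np.uint8)
print(gamma_correct(gray, 2.0)[0, 0])  # darker than 128 (gamma > 1)
print(gamma_correct(gray, 0.5)[0, 0])  # brighter than 128 (gamma < 1)
```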
*[Case 1: γ = 1.0 (before gamma correction)]*

p=. !{width:800px}before.png!

*[Case 2: γ = 0.8 (after gamma correction)]*

p=. !{width:800px}after.png!

*[Case 3: γ = 2.0 (after gamma correction)]*

p=. !{width:800px}after2.png!

External reference (all three examples):
http://rs.aoyaman.com/seminar/about10.html

---

h2. Intensity Modification

This method adjusts brightness at the image edges using gamma correction and alpha blending, depending on the *image_side* value:

* image_side = 1 → intensity decreases toward the left edge
* image_side = 0 → intensity decreases toward the right edge

By combining gamma correction and alpha blending, the method produces a smooth fading effect along the edges, allowing the image to blend seamlessly with the background or with another image.
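A simplified sketch of this edge fade, using a plain linear ramp in place of the combined gamma/alpha treatment; `edge_fade` and `blend_width` are illustrative names, not the project's API:

```python
import numpy as np

def edge_fade(img, image_side, blend_width):
    """Fade intensity toward one edge over blend_width columns.
    image_side = 1 -> fade toward the left edge; 0 -> toward the right edge."""
    out = img.astype(np.float32)
    ramp = np.linspace(0.0, 1.0, blend_width)  # 0 at the edge, 1 at the interior
    if image_side == 1:
        out[:, :blend_width] *= ramp           # darken the left columns
    else:
        out[:, -blend_width:] *= ramp[::-1]    # darken the right columns
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.full((2, 8), 200, dtype=np.uint8)
left = edge_fade(img, image_side=1, blend_width=4)
print(left[0])  # intensity rises from 0 at the left edge back to 200
```

When the left-fading image overlaps a right-fading copy from the second projector, the two ramps sum to roughly constant brightness across the overlap.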

---

h2. Transparency

h3. Gradient Mask Creation

* The *create_mask()* method (in the *MaskCreator* class) generates a smooth transparency gradient using the *smoothstep* function.
* This gradient determines the transparency level across the masked region, providing a non-linear transition from fully transparent to fully opaque (or vice versa).
* The smoothstep function ensures a gradual, visually smooth change in opacity, minimizing harsh edges.

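The smoothstep gradient can be sketched as follows; `create_mask` here is a simplified 1-D stand-in for the method described above, not its actual implementation:

```python
import numpy as np

def smoothstep(t):
    """Classic smoothstep 3t^2 - 2t^3, flat at both ends of [0, 1]."""
    t = np.clip(t, 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def create_mask(width, transparency_factor=1.0):
    """1-D alpha gradient from transparent (0) toward opaque,
    scaled by transparency_factor."""
    t = np.linspace(0.0, 1.0, width)
    return smoothstep(t) * transparency_factor

mask = create_mask(5)
print(mask)  # rises smoothly from 0.0 to 1.0, steepest in the middle
```

Compared with a linear ramp, smoothstep has zero slope at both ends, which hides the seam where the blended region meets the untouched part of the image.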
h3. Transparency Factor

* The *transparency_factor* parameter scales the gradient values, directly influencing overall transparency.
* Lower values mean more transparency; values closer to 1 mean higher opacity.

h3. Alpha Blending

* The *alpha_blending()* method applies the alpha gradient to the image within a defined *Region of Interest (ROI)*.
* Blending follows the equation:

p=. *blended = (α × ROI) + (1 − α) × background*

where:

* *(α × ROI)* applies the transparency gradient to the image
* *(1 − α) × background* ensures a gradual transition into the black background


h3. Transparency Side Control

* The direction of the gradient depends on *image_side*:
** *Right side* → the gradient is applied as-is
** *Left side* → the gradient is reversed

By combining the smoothstep gradient, transparency-factor scaling, and alpha blending, this method enables precise control of transparency on the desired side of the image, producing a seamless blending effect.


---

h2. Limitations and Possible Enhancements (WIP)

h3. Limitations

* The current prototype mainly works with static images, so live video blending and projected-alignment adjustment are limited.
* Edge blending only supports the left and right sides; vertical or diagonal blending is not available, which limits flexibility in projector setups.
* The prototype does not support automated calibration or geometric correction, so all alignment between the two projectors must be handled manually.
* The prototype assumes identical projectors, so any mismatch in projector properties (brightness, contrast, color tone) can produce visible seams or image inconsistencies.
* The prototype uses fixed input file names, which must be changed manually.

h3. Possible Enhancements

* Allow parameters such as gamma, blend width, and gradient method to be adjusted dynamically in the GUI without restarting the program
* Support multiple video codecs, resolutions, and output formats
* Expand edge blending to include the top, bottom, and diagonal sides