Resizing a form while keeping aspect ratio is useful in many cases, like video playback or vector graphics. This way, the window can be resized while retaining the original ratio and avoiding the use of letterboxing or pillarboxing.
What’s needed is to override the window procedure (WndProc) and pre-process the target window rectangle carried by the WM_SIZING message.
The new destination rectangle is calculated by taking into account the resizing handle and the window chrome size (title height, border width, etc.).
protected override void WndProc(ref Message m)
{
    if (m.Msg == WM_SIZING)
    {
        RECT rc = (RECT)Marshal.PtrToStructure(m.LParam, typeof(RECT));
        int w = rc.Right - rc.Left - chromeWidth;   // Proposed client width
        int h = rc.Bottom - rc.Top - chromeHeight;  // Proposed client height
        switch (m.WParam.ToInt32()) // Resize handle being dragged
        { /* adjust rc per handle to keep the aspect ratio */ }
        Marshal.StructureToPtr(rc, m.LParam, true); // Write the rectangle back
    }
    base.WndProc(ref m);
}
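The per-handle adjustment inside the switch could look like the following sketch. The `WMSZ_*` values are the documented WM_SIZING wParam constants from WinUser.h; `aspectRatio` is an assumed field holding the desired width/height ratio (e.g. `16.0 / 9.0`), and `chromeWidth`/`chromeHeight` are the window chrome sizes computed elsewhere:

```csharp
// Sketch only: derive one dimension from the other, per resize handle.
const int WMSZ_LEFT = 1, WMSZ_RIGHT = 2, WMSZ_TOP = 3, WMSZ_BOTTOM = 6;
switch (m.WParam.ToInt32())
{
    case WMSZ_LEFT:
    case WMSZ_RIGHT:
        // Width was dragged: derive the height from it
        rc.Bottom = rc.Top + chromeHeight + (int)(w / aspectRatio);
        break;
    case WMSZ_TOP:
    case WMSZ_BOTTOM:
        // Height was dragged: derive the width from it
        rc.Right = rc.Left + chromeWidth + (int)(h * aspectRatio);
        break;
    default:
        // Corner handles: let the width win and adjust the height
        rc.Bottom = rc.Top + chromeHeight + (int)(w / aspectRatio);
        break;
}
```

Which dimension "wins" on corner drags is a design choice; the full source linked below handles each corner individually.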
You can find the full C# source code here, including a test program. The aspect ratio and initial client size are set to 16:9.
Lossless audio is used on various media, including studio masters, CD, DVD-Audio (via MLP) and Blu-ray, via Dolby TrueHD (technically a rebranded extension of MLP) and DTS-HD Master Audio. All of these, when decoded, result in a pulse-code modulated signal identical to the source, unlike the popular MP3 format. MP3 makes a quality/file-size trade-off by discarding or attenuating frequencies that are less audible to human hearing.
By design, PCM uses a constant bitrate proportional to the sample rate, bit depth and number of audio channels, so file size grows with each of these parameters and with the duration of the audio track.
Solutions such as FLAC, TrueHD and DTS-HD MA are used to losslessly compress the source audio so that the rest of the medium (for example, a Blu-ray disc) can be used for more audio tracks, higher-bandwidth video or extras.
Out of the aforementioned, only FLAC is free to use—both TrueHD and DTS-HD MA encoders and decoders have to be licensed.
The first part of the series will explore the processing of uncompressed audio data with the FLAC API in C#. In case you prefer to use Visual Basic .NET, you can use this online converter.
Anatomy of a WAVE file
In order to process a digital audio signal, we have to know three key parameters:
- The frequency at which the signal was sampled. Usual sample rates are 44,100 Hz, 48,000 Hz, 96,000 Hz and, rarely, 192,000 Hz.
- The “depth” of each sample, measured in bits. The FLAC encoder supports up to 24 bits. Our C# WAVE reader will support 16 and 24 bits of audio data.
- The number of channels the recording consists of. This is usually mono, stereo (CD), 5.1 (DVD-Audio, Blu-ray) or 7.1 (Blu-ray).
The most common container for PCM audio data is the WAVE file format. As noted above, PCM has a constant bitrate of
SampleRate * BitDepth * Channels,
which makes it very easy to predict the size of each block of audio samples: a single second of audio data takes Bitrate / 8 bytes (8 bits in a byte), e.g. 176.4 KB for a second of CD-quality audio.
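To make the arithmetic concrete, a quick sketch for CD-quality audio (44,100 Hz, 16-bit, stereo):

```csharp
int sampleRate = 44100, bitDepth = 16, channels = 2;
int bitrate = sampleRate * bitDepth * channels; // 1,411,200 bits per second
int bytesPerSecond = bitrate / 8;               // 176,400 bytes ≈ 176.4 KB
Console.WriteLine(bytesPerSecond);              // prints 176400
```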
We can create the initialization method of the WavReader class by starting with an input Stream object. We have to ensure that there is enough available data for the wave format header, and check that the file is indeed a RIFF/WAVE file to avoid unnecessary reading and processing.
uRiffHeader = reader.ReadInt32();
uRiffHeaderSize = reader.ReadInt32();
uWaveHeader = reader.ReadInt32();
if (uRiffHeader != 0x46464952 /* "RIFF" */ ||
    uWaveHeader != 0x45564157 /* "WAVE" */)
    throw new Exception("Invalid WAVE header!");
Right after the RIFF chunk there can be a number of JUNK (padding) chunks, which we can skip, as well as the fmt and data chunks, whose contents we need.
// Read all WAVE chunks
while (reader.BaseStream.Position < reader.BaseStream.Length)
{
    int type = reader.ReadInt32();
    int size = reader.ReadInt32();
    long last = reader.BaseStream.Position;
    switch (type)
    {
        case 0x61746164: /* "data" */
            uDataHeader = type;
            nTotalAudioBytes = size;
            break;
        case 0x20746D66: /* "fmt " */
            uFmtHeader = type;
            uFmtHeaderSize = size;
            format.wFormatTag = reader.ReadInt16();
            format.nChannels = reader.ReadInt16();
            format.nSamplesPerSec = reader.ReadInt32();
            format.nAvgBytesPerSec = reader.ReadInt32();
            format.nBlockAlign = reader.ReadInt16();
            format.wBitsPerSample = reader.ReadInt16();
            format.cbSize = reader.ReadInt16();
            break;
    }
    if (uDataHeader == 0) // Keep seeking until the 'data' chunk is found
        reader.BaseStream.Position = last + size;
    else
        break; // Leave the stream positioned at the first audio sample
}
Our WavReader class only supports 16- and 24-bit PCM samples, so we have to ensure that format.wFormatTag is 1 (PCM) and format.wBitsPerSample is either 16 or 24. These limitations could be lifted later by implementing on-the-fly sample conversion.
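That check can be as simple as the following sketch (the exception type and message are illustrative choices):

```csharp
if (format.wFormatTag != 1 /* WAVE_FORMAT_PCM */ ||
    (format.wBitsPerSample != 16 && format.wBitsPerSample != 24))
    throw new NotSupportedException(
        "Only 16- and 24-bit PCM WAVE files are supported.");
```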
After all headers are read, nTotalAudioBytes will contain the total count of audio data bytes in the WAVE file. To determine the duration of the audio track, we can simply divide it by the byte rate (Bitrate / 8 bytes per second).
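Using the fmt chunk fields read above, the duration computation might look like this sketch:

```csharp
// Byte rate follows from the fmt chunk: samples/s * bytes/sample * channels
int byteRate = format.nSamplesPerSec * (format.wBitsPerSample / 8)
             * format.nChannels;
double durationSeconds = (double)nTotalAudioBytes / byteRate;
```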
The input stream will now be positioned at the start of the audio samples. Each sample occupies 16 or 24 bits (2 or 3 bytes, respectively), and the samples of all channels are interleaved, so the stream consists of:
channel 0, sample 0
channel 1, sample 0
…
channel n, sample 0
channel 0, sample 1
…
Now that we have reached the audio samples, we can start feeding them to FLAC.
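As a sketch of what consuming the interleaved stream could look like for 16-bit samples (the block size and buffer names are illustrative; `reader` is the BinaryReader from above):

```csharp
// De-interleave one block of 16-bit samples into per-channel buffers,
// the layout the FLAC encoder expects (one int[] per channel).
int samplesPerChannel = 4096; // arbitrary block size
var samples = new int[format.nChannels][];
for (int ch = 0; ch < format.nChannels; ch++)
    samples[ch] = new int[samplesPerChannel];

for (int i = 0; i < samplesPerChannel; i++)
    for (int ch = 0; ch < format.nChannels; ch++)
        samples[ch][i] = reader.ReadInt16(); // sign-extends to 32 bits
```

For 24-bit audio the same loop would read three bytes per sample and sign-extend manually, since .NET has no ReadInt24.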