On Sep 3 14:46, Corinna Vinschen wrote:
> On Sep 2 05:51, Steven Penny wrote:
> > On Sun, 2 Sep 2018 10:07:10, Thomas Wolff wrote:
> > > Actually, the width problem I suggested in my other response (and
> > > even referring to the wrong character) does not apply, as mintty
> > > enforces proper width in that case.
> > > Also, even with fonts that do not provide the glyph, you will
> > > usually still see it via the Windows font fallback mechanism.
> > > Shall I make it configurable?
> > Your call - here are the possible resolutions, in order of my
> > preference:
> > 1. Change the default to U+FFFD with no option
> > 2. Change the default to U+FFFD with an option to change it
> > 3. Leave the default as is with an option to change it
> Ideally we could check if the current font supports a visual
> representation of 0xfffd and, if not, fall back to 0x2592.
> Not sure how feasible that is, but it doesn't seem to be overly
> complicated.  I'm just looking into a solution for the Cygwin
> console.
Only, I can't get this working.  In theory the GDI function
GetGlyphIndicesW is supposed to allow checking whether a certain
character exists in the font selected into a DC.  But I'm getting a
weird result.  This code:
  static const wchar_t replacement_char[2] =
  {
    0xfffd,   /* REPLACEMENT CHARACTER */
    0x2592    /* MEDIUM SHADE */
  };
  HWND cwnd = GetConsoleWindow ();
  HDC cdc = GetDC (cwnd);
  int rp_idx = 0;
  WORD gi = 0;
  /* With GGI_MARK_NONEXISTING_GLYPHS, a character the font has no
     glyph for is returned as glyph index 0xffff.  */
  DWORD ret = GetGlyphIndicesW (cdc, replacement_char, 1, &gi,
                                GGI_MARK_NONEXISTING_GLYPHS);
  /* If U+FFFD has no glyph in the font, fall back to U+2592.  */
  if (ret != GDI_ERROR && gi == 0xffff)
    rp_idx = 1;
always sets rp_idx to 1 when called from inside the Cygwin DLL,
independently of the actual console font.  And, here's the really weird
thing: it always sets rp_idx to 0 when called directly from an
application, likewise independently of the actual console font.
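
For reference, the standalone test is essentially the same snippet
wrapped in main; a minimal sketch of it (assuming a gcc build linked
with -lgdi32, everything besides the Win32 calls is just my test
scaffolding):

  #include <windows.h>
  #include <stdio.h>

  int
  main (void)
  {
    static const wchar_t replacement_char[2] =
    {
      0xfffd,   /* REPLACEMENT CHARACTER */
      0x2592    /* MEDIUM SHADE */
    };
    HWND cwnd = GetConsoleWindow ();
    HDC cdc = GetDC (cwnd);
    WORD gi = 0;
    DWORD ret = GetGlyphIndicesW (cdc, replacement_char, 1, &gi,
                                  GGI_MARK_NONEXISTING_GLYPHS);
    /* Print the raw result so the behavior can be compared with the
       in-DLL call.  */
    printf ("ret=%lu gi=0x%04x -> rp_idx=%d\n", (unsigned long) ret, gi,
            (ret != GDI_ERROR && gi == 0xffff) ? 1 : 0);
    ReleaseDC (cwnd, cdc);
    return 0;
  }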
Does anybody have an idea what I'm doing wrong?