2008-03-12, 09:40
Hi all,
Elan's recent commits to SVN sort out a lot of the issues that I think some of us (well, me anyway) have been patching and hacking around, which is excellent. I now have a working build with the Apple Remote support, 5.1 mixdown, the "\" key, etc., plus things like MythTV integration.
I did have to make one small hack to get this build to compile, however. I'm sure that's down to my inexperience with C++ on the Mac, so I wonder if anyone can help my understanding?
I understand that the pre-processor flag __APPLE__ is defined automatically by gcc on the OS X platform. I also see that _LINUX is set in the Xcode project's pre-processor definitions. So we are treating the OS X port as a subset of the Linux one, with Apple-specific exceptions, right?
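(To double-check that both flags really are in play on my machine, I built a little throwaway test along these lines. This is just my own sketch, nothing from the XBMC sources:)

#include <cstdio>

int main()
{
  // __APPLE__ is predefined by gcc on OS X; _LINUX only shows up if the
  // Xcode project (or Makefile) defines it explicitly.
#ifdef __APPLE__
  printf("__APPLE__ is defined\n");
#endif
#ifdef _LINUX
  printf("_LINUX is defined\n");
#endif
  return 0;
}

On an OS X build with those project settings I'd expect both lines to print, which would match my reading of how the port is set up.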
OK, in the SVN log for CharsetConverter.cpp Elan says "OS X prefers its iconv input strings to be const char**", but on line 354 the code is:
#if defined(_LINUX) && !defined(__APPLE__)
if (iconv(m_iconvUtf8ToStringCharset, (char**)&src, &inBytes, &dst, &outBytes) == (size_t) -1)
#else
if (iconv(m_iconvUtf8ToStringCharset, &src, &inBytes, &dst, &outBytes) == (size_t) -1)
#endif
Which seems to say: if we're on Linux and NOT on the Mac, cast &src to (char**); otherwise (e.g. we are on the Mac) don't. On my machine this fails to compile; gcc complains that you can't cast char** to const char**. If I change the code to...
#if defined(_LINUX) && !defined(__APPLE__)
if (iconv(m_iconvUtf8ToStringCharset, (char**)&src, &inBytes, &dst, &outBytes) == (size_t) -1)
#else
if (iconv(m_iconvUtf8ToStringCharset, (char**)&src, &inBytes, &dst, &outBytes) == (size_t) -1)
#endif
...all works fine.
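For what it's worth, what I half expected to find there is a single call with the const difference hidden behind a macro, the way autoconf's AM_ICONV handles it with ICONV_CONST. Here's a standalone sketch of that idea (my own test program, not the XBMC code; I'm hard-coding ICONV_CONST to empty, which matches the glibc-style char** prototype on my box):

#include <iconv.h>
#include <string.h>
#include <stdio.h>

// Empty for glibc-style headers where iconv() takes char **inbuf; it would be
// defined to "const" on headers that declare the argument as const char **
// (which is apparently what the OS X headers do).
#ifndef ICONV_CONST
#define ICONV_CONST
#endif

int main()
{
  iconv_t cd = iconv_open("UTF-16LE", "UTF-8");
  if (cd == (iconv_t)-1)
    return 1;

  const char* src = "hello";   // the input is logically const, as in CharsetConverter
  size_t inBytes = strlen(src);

  char buf[64];
  char* dst = buf;
  size_t outBytes = sizeof(buf);

  // One explicit cast covers either prototype, so no #if/#else is needed.
  if (iconv(cd, (ICONV_CONST char**)&src, &inBytes, &dst, &outBytes) == (size_t)-1)
    perror("iconv");

  iconv_close(cd);
  return 0;
}

If something like that is the right approach then the platform check in CharsetConverter.cpp could go away entirely, but maybe there's a reason for the #if that I'm not seeing.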
The logic in the code therefore seems odd to me. My question is: should the _LINUX flag be set in the Xcode project at all? And if so, is the logic in that fragment the wrong way round?
I hope someone can shed some light on this; I expect it's something simple that I've overlooked or misunderstood. Any guidance gratefully received.
Thanks,
Spiderlane