Date: | April 10, 2008 / year-entry #118 |
Tags: | code |
Orig Link: | https://blogs.msdn.microsoft.com/oldnewthing/20080410-00/?p=22793 |
Comments: | 14 |
Summary: | You're trying to compile your program and you're getting an error complaining that somebody already has a conflicting definition for a macro or some other name you're using. error: sample.cpp(35): conflicting definition of macro 'AWESOME' error: sample.cpp(92): conflicting definition of type 'AWESOME' If your compiler is helpful, it'll tell you where the previous definition was.... |
You're trying to compile your program and you're getting an error complaining that somebody already has a conflicting definition for a macro or some other name you're using.

error: sample.cpp(35): conflicting definition of macro 'AWESOME'
error: sample.cpp(92): conflicting definition of type 'AWESOME'

If your compiler is helpful, it'll tell you where the previous definition was. But what if your compiler isn't quite so helpful? How can you find that conflicting definition? Turnabout is fair play. (I don't actually believe that turnabout is fair play, but it makes for a catchy title.)
The problem is that you're the second definition and
you want to find the first definition.
So jump to the head of the line and become the new first definition.
Compile the file with the -DAWESOME=@ command line switch (or however your compiler lets you define a macro before the first line of the file). When the offending header is reached, the line that defines the macro is now the second definition, so the compiler reports

error: header.h(10): conflicting definition of macro 'AWESOME'

when the first definition is reached.

With your addition of the -DAWESOME=@ switch, the conflicting type case instead produces

error: header.h(30): illegal character @ in source file

when the conflicting type definition is reached. This time, instead of a conflicting macro definition, you created a syntax error.
As I noted, if your compiler is friendly and helpful, you won't need to use this tip, but sometimes you have to make do with what you've got.
Comments (14)
Comments are closed.
And if the compiler were designed better, it would have tracked the origin line of the definitions, saving you time and labor. :)
I realize I’m probably violating one of the blog rules, but this series has brought a question to mind.
In a recent interview, Bjarne Stroustrup said the inclusion of macros in the C++ specification was a necessary mistake. From your articles, it is clear that macros can sometimes be a bit of a headache, so I thought I’d ask: what is your opinion on macros in programming languages? Mind you, I’m not looking for a treatise on the subject; a single sentence will do (for me, at least).
Counter question: Are you aware that in other languages macros might not just work by text substitution? Some actually add new production rules to the language. I hope I don’t need to tell you the benefits of this approach and how it eases programming…
How did I know Tom’s comment was going to bring out the Smug Lisp Weenies?
http://c2.com/cgi/wiki?SmugLispWeenie
A more frequently encountered scenario would be if the rogue header file had used #ifndef AWESOME before defining AWESOME… the suggested method won't work, and the scary fact is… this kind of code is common.
This reminds me of the common sense development practice that too few developers actually do: Make a quick project and test assumptions.
Someone and I recently were talking about virtual vs non-virtual destructors and what happens when you have class A with virtual ~A inherited by class B with non-virtual ~B inherited by class C with non-virtual ~C.
Rather than wondering if A would destruct and B and C would not destruct, I just fired up the compiler and wrote the program. All 3 destructors were called just fine.
It’s amazing how little time it takes to make a test app, and test some hypothesis, and yet so many people just assume something without testing it. Maybe they are too lazy or not curious enough to find out the answers.
Abusing macros is bad. Abusing templates can be much, much worse. That's why I like C#. It doesn't have macros, and generics aren't quite the same thing as templates.
@mikefried: True – now the problem is this: Does it work because it is supposed to work that way, or is it just a coincidence that it works?
Also, I think macros are nastier than templates. windows.h for example:
#define Yield()
Which means that if you name a function Yield, it’ll be replaced with whitespace when you compile. Took me a bit to figure that one out.
Templates, at least, are namespaced and won't bite you like this.
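The Yield() hazard described above takes seconds to reproduce with any compiler's preprocessor (a sketch; yield_demo.cpp is a made-up name, and the #define stands in for what windows.h does):

```shell
# A function-like macro that expands to nothing erases any matching
# call or declaration during preprocessing.
cat > yield_demo.cpp <<'EOF'
#define Yield()   // roughly what windows.h does
void Yield();     // your declaration: the macro eats "Yield()"
EOF
# -E -P: preprocess only, without line markers. The name and its
# parentheses are gone from the declaration; roughly "void ;" survives.
g++ -E -P yield_demo.cpp
```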
nice trick!
(now I need a "what you got" compiler to show off…)
post-mortem snarky comment: or just demand that you get a better one.
I once pondered writing a parser that reads windows headers and spits out C++-style headers that use enums and inline functions. I had no idea how to figure out the function parameters, though….
In this case, the answer is that since a base class has a virtual destructor all those classes will have virtual destructors. This is part of the C++ specification. (12.4.7 – If a class has a base class with a virtual destructor, its destructor (whether user or implicitly declared) is virtual.)
But your real point may have been that just because something works a particular way on one compiler does not necessarily mean that it is guaranteed behavior.
In that case (or something even more complex) what should be done? I think that in cases of macro redefinition it is better to change your macro name. Moreover people should refrain from defining their own macros if they are provided by standard system headers.
"sometimes you you have"
One less ‘you’ please :)
tcliu wrote:
I think you are digging too deep:
The point I was trying to make is that people need to avail themselves of the compiler more to answer their questions. Raymond’s "hack" to #define before the headers is one of many clever ways to use the tools. When you pick a set of tools, you commit to their shortcomings, so whenever you have a question about the tools, you should test the tools to find your answer. The most important part of learning on the job is to learn to understand your tools. Always take opportunities to question the tools when you have a question.
mikeb wrote:
That is a good point, not exactly what I was getting at, but a good point nonetheless. I like to put it the other way: Having book knowledge of the standard and being able to know why to expect one thing out of the language/platform is well and good, but platforms and tools are made by imperfect humans. They may be very well tested, but their behaviors are all testable by you. By testing them, you may learn something. No test is too small or stupid to run on your tools. Even the best developers get surprised sometimes.
mikefried: What you just said only applies half the time. If the standard says one thing, so you test your compiler, and it does something different, then what you said applies. It’s good to know when your tools don’t follow the standards.
But if the various applicable standards (and the tool documentation) say nothing about whatever you’re testing, then even if it works today, that means nothing about whether it’s actually correct.
In other words, if a test fails, that’s useful knowledge. But if a test succeeds, that means almost nothing. It just means that one particular instance of the tool-set works with one particular test. If the standards and documentation are silent, then just because it works today, in your environment, on your installation of the tools, doesn’t mean it will continue to work tomorrow, or in a different environment, or on a different installation of those tools.
(How many times has Raymond blogged about some compatibility hack that had to be inserted into various parts of windows because people were running exactly the type of test you’re talking about, and interpreting the positive test result as a guarantee that whatever they were doing was correct?)
Is this affected by the MS development environment's insistence that the precompiled header inclusion is the first thing in the file? As I recall (sorry, I no longer have VS installed) anything above the #include <stdafx.h> was simply ignored.