springrts.com: Lua Performance

Lua Performance
Contents
1 Overview
2 Other Considerations
3 Performance Tests
3.1 TEST 1: Localize
3.2 TEST 2: Localized Class-Methods (with only 3 accesses!)
3.3 TEST 3: Unpack A Table
3.4 TEST 4: Determine Maximum And Set It (‘>’ vs. max)
3.5 TEST 5: Nil Checks (‘if’ vs. ‘or’)
3.6 TEST 6: ‘x^2’ vs. ‘x*x’
3.7 TEST 7: Modulus Operators (math.mod vs. %)
3.8 TEST 8: Functions As Param For Other Functions
3.9 TEST 9: for-loops
3.10 TEST 10: Array Access (with [ ]) vs. Object Access (with .method)
3.11 TEST 11: Buffered Table Item Access
3.12 TEST 12: Adding Table Items (table.insert vs. [ ])
3.13 TEST 13: Adding Table Items (mytable = {} vs. mytable = {…})
4 Lua Garbage Collection
Overview
This page is copied from the CA wiki. The widget used in the performance tests is available from the CA SVN.
Other Considerations
It is a well-known axiom in computing that
“We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil”
– Donald Knuth
Lua coders should keep that in mind, especially when visiting this page. Readability and maintainability are in most cases just as important as raw speed, and optimizing code for every last ounce of performance can severely impact both. On the other hand, some of the optimizations suggested here have little bearing on readability and can generally always be applied, e.g. localizing API functions, or actually make for neater code, e.g. using or rather than a nil-check. Generally speaking, optimize only once you are sure that there is, or will be, a performance bottleneck.
Performance Tests
TEST 1: Localize
Code:
local min = math.min

Results:
Non-local: 0.719 (158%)
Localized: 0.453 (100%)
Conclusion:
Yes, we should localize all standard Lua and Spring API functions.
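
For reference, the numbers above come from timing a tight loop many times; a minimal sketch of such a micro-benchmark is shown below. It is only an illustration: the actual CA test widget runs inside Spring with its own timers, so the use of plain os.clock() and the iteration count here are assumptions.

-- Minimal micro-benchmark sketch (not the actual CA test widget).
-- Compares calling math.min through the global table vs. through a local.
local function bench(label, fn)
  local t0 = os.clock()
  fn()
  print(string.format("%s: %.3f s", label, os.clock() - t0))
end

bench("non-local", function()
  for i = 1, 1000000 do
    local x = math.min(i, 2)  -- global table lookup on every iteration
  end
end)

local min = math.min          -- localize once, outside the loop
bench("localized", function()
  for i = 1, 1000000 do
    local x = min(i, 2)       -- direct access to a local
  end
end)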

TEST 2: Localized Class-Methods (with only 3 accesses!)
Code 1:
for i=1,1000000 do
local x = class.test()
local y = class.test()
local z = class.test()
end

Code 2:
for i=1,1000000 do
local test = class.test
local x = test()
local y = test()
local z = test()
end

Results:
Normal way: 1.203 (102%)
Localized: 1.172 (100%)
Conclusion:
No, it is not noticeably faster to localize a class method inside the calling function when it is only accessed a few times.

TEST 3: Unpack A Table
Code 1:
for i=1,1000000 do
local x = min( a[1],a[2],a[3],a[4] )
end

Code 2:
local unpack = unpack
for i=1,1000000 do
local x = min( unpack(a) )
end

Code 3:
local function unpack4(a)
return a[1],a[2],a[3],a[4]
end
for i=1,1000000 do
local x = min( unpack4(a) )
end

Results:
with [ ]: 0.485 (100%)
unpack(): 1.093 (225%)
custom unpack4: 0.641 (131%)
Conclusion:
Don’t use unpack() in time-critical code!

TEST 4: Determine Maximum And Set It (‘>’ vs. max)
Code 1:
local max = math.max
for i=1,1000000 do
x = max(random(cnt),x)
end

Code 2:
for i=1,1000000 do
local r = random(cnt)
if (r>x) then x = r end
end

Results:
math.max: 0.437 (156%)
‘if > then’: 0.282 (100%)
Conclusion:
Don’t use math.[max|min]() in time-critical code!

TEST 5: Nil Checks (‘if’ vs. ‘or’)
Code 1:
for i=1,1000000 do
local y,x
if (random()>0.5) then y=1 end
if (y==nil) then x=1 else x=y end
end

Code 2:
for i=1,1000000 do
local y
if (random()>0.5) then y=1 end
local x=y or 1
end

Results:
nil-check: 0.297 (106%)
a=x or y: 0.281 (100%)
Conclusion:
The or-operator is faster than a nil-check. Use it!
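
One caveat worth keeping in mind (a small illustrative sketch, not one of the timed tests): the or-operator treats false like nil, so it only replaces a nil-check when false is not a meaningful value for the variable.

-- Sketch: 'y or 1' falls back to 1 for both nil and false.
local function pick(y)
  return y or 1
end
print(pick(nil))    --> 1
print(pick(false))  --> 1  (false is replaced too; keep an explicit nil-check if false is valid)
print(pick(0))      --> 0  (0 is truthy in Lua)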

TEST 6: ‘x^2’ vs. ‘x*x’
Code 1:
for i=1,1000000 do
local y = x^2
end

Code 2:
for i=1,1000000 do
local y = x*x
end

Results:
x^2: 1.422 (110%)
x*x: 1.297 (100%)
Conclusion:
x*x is marginally faster than x^2.

TEST 7: Modulus Operators (math.mod vs. %)
Code 1:
local fmod = math.fmod
for i=1,1000000 do
if (fmod(i,30)<1) then
local x = 1
end
end

Code 2:
for i=1,1000000 do
if ((i%30)<1) then
local x = 1
end
end

Results:
math.mod: 0.281 (355%)
%: 0.079 (100%)
Conclusion:
Don't use math.fmod(); the % operator is much faster. (Note that for negative operands, % and fmod() give different results!)

TEST 8: Functions As Param For Other Functions
Code 1:
local func1 = function(a,b,func)
return func(a+b)
end

for i=1,1000000 do
local x = func1(1,2,function(a) return a*2 end)
end

Code 2:
local func1 = function(a,b,func)
return func(a+b)
end
local func2 = function(a)
return a*2
end

for i=1,1000000 do
local x = func1(1,2,func2)
end

Results:
defined in function param: 3.890 (1144%)
defined as local: 0.344 (100%)
Conclusion:
REALLY, ALWAYS LOCALIZE YOUR FUNCTIONS BEFORE PASSING THEM INTO ANOTHER FUNCTION, e.g. when you use gl.BeginEnd(), gl.CreateList(), …!
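
Applied to drawing code, that means defining the callback once and passing it by name instead of building a closure at the call site. The sketch below assumes Spring's gl.BeginEnd(primitiveType, func, ...) calling convention and the standard widget DrawScreen callback; treat the exact signatures as assumptions.

-- Sketch: reuse one callback instead of creating a closure every frame.
local glBeginEnd = gl.BeginEnd
local glVertex   = gl.Vertex

local function DrawQuad(x, y, size)  -- defined once, reused on every call
  glVertex(x,        y)
  glVertex(x + size, y)
  glVertex(x + size, y + size)
  glVertex(x,        y + size)
end

function widget:DrawScreen()
  glBeginEnd(GL.QUADS, DrawQuad, 100, 100, 32)  -- good: no new function object per frame
  -- glBeginEnd(GL.QUADS, function() DrawQuad(100, 100, 32) end)  -- bad: allocates a closure each frame
end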

TEST 9: for-loops
Code 1:
for i=1,1000000 do
for j,v in pairs(a) do
x=v
end
end

Code 2:
for i=1,1000000 do
for j,v in ipairs(a) do
x=v
end
end

Code 3:
for i=1,1000000 do
for i=1,100 do
x=a[i]
end
end

Code 4:
for i=1,1000000 do
for i=1,#a do
x=a[i]
end
end

Code 5:
for i=1,1000000 do
local length = #a
for i=1,length do
x=a[i]
end
end

Results:
pairs: 3.078 (217%)
ipairs: 3.344 (236%)
for i=1,x do: 1.422 (100%)
for i=1,#atable do: 1.422 (100%)
for i=1,atable_length do: 1.562 (110%)
Conclusion:
Don't use pairs() or ipairs() in critical code! Try to save the table-size somewhere and use for i=1,x do!

TEST 10: Array Access (with [ ]) vs. Object Access (with .method)
Code 1:
for i=1,1000000 do
x = a["foo"]
end

Code 2:
for i=1,1000000 do
x = a.foo
end

Results:
atable["foo"]: 1.125 (100%)
atable.foo: 1.141 (101%)
Conclusion:
No difference.

TEST 11: Buffered Table Item Access
Code 1:
for i=1,1000000 do
for n=1,100 do
a[n].x=a[n].x+1
end
end

Code 2:
for i=1,1000000 do
for n=1,100 do
local y = a[n]
y.x=y.x+1
end
end

Results:
'a[n].x=a[n].x+1': 1.453 (127%)
'local y=a[n]; y.x=y.x+1': 1.140 (100%)
Conclusion:
Buffering can speed up table item access.

TEST 12: Adding Table Items (table.insert vs. [ ])
Code 1:
local tinsert = table.insert
for i=1,1000000 do
tinsert(a,i)
end

Code 2:
for i=1,1000000 do
a[i]=i
end

Code 3:
for i=1,1000000 do
a[#a+1]=i
end

Code 4:
local count = 1
for i=1,1000000 do
d[count]=i
count=count+1
end

Results:
table.insert: 1.250 (727%)
a[i]: 0.172 (100%)
a[#a+1]=x: 0.453 (263%)
a[count]=x (manual counter): 0.203 (118%)
Conclusion:
Don't use table.insert! If the index is known, use a[i]=x; otherwise keep your own counter as in Code 4.

TEST 13: Adding Table Items (mytable = {} vs. mytable = {…})
When you write {true, true, true}, Lua knows beforehand that the table will need three slots in its array part, so Lua creates the table with that size. Similarly, if you write {x = 1, y = 2, z = 3}, Lua will create a table with four slots in its hash part.
As an example, the next loop runs in 2.0 seconds:

for i = 1, 1000000 do
local a = {}
a[1] = 1; a[2] = 2; a[3] = 3
end
If we create the tables with the right size, we reduce the run time to 0.7 seconds:
for i = 1, 1000000 do
local a = {true, true, true}
a[1] = 1; a[2] = 2; a[3] = 3
end
If you write something like {[1] = true, [2] = true, [3] = true}, however, Lua is not smart enough to detect that the given expressions (literal numbers, in this case) describe array indices, so it creates a table with four slots in its hash part, wasting memory and CPU time.

China shops: clothing, mobile phones and tablets at bargain prices


Chinese online shops appear more and more frequently in Google search results, yet many people still regard them with a certain skepticism. Unjustly so, because the days when Chinese merchants drew attention mainly through counterfeit products are over. Instead, the online department stores from China today offer Asian brand-name goods whose quality need not hide behind products from Germany, Europe, or the USA. That many internationally known brands have their goods manufactured in China and then sell them under their own name at huge margins is an open secret.

A good example is the German shoe trade, which in 2013 alone imported almost 300 million pairs of shoes from China and resold them at an average margin of a hefty 47%; see http://de.statista.com/themen/158/schuhhandel-in-deutschland/ It is no different with electronics, where the margins are sometimes considerably higher. Such figures also illustrate how much money can be saved by importing directly, that is, by ordering straight from China.

We answer the most frequent questions about shopping in China, give practical tips, and show you which Chinese online shops you can, from our own experience, buy from without hesitation.

A DNA-based archival storage system

the morning paper

A DNA-Based Archival Storage System – Bornholt et al. ASPLOS ’16

It’s pretty cool that a paper on DNA-based storage makes a conference such as ASPLOS. And as you’ll see, there are good reasons we should be taking it very seriously indeed. DNA has some very interesting properties – it’s extremely dense (1 exabyte (10⁹ GB) per mm³) and lasts a long time (observed half-life of over 500 years). It doesn’t work quite like this, but to put that density into context, imagine how many mm³ of raw storage you could fit into a USB-sized form factor. I’ll guess at least 100 – that would be 100,000 PB of storage in your thumb-drive!

In this paper the authors build a DNA-based key-value store supporting random access and are successfully able to store and retrieve data. Crucially, it seems that the internet’s cute kitten image archive can be…

View original post 1,531 more words

codeproject.com: The Ultimate Grid Home Page


We are very happy to announce that we have made the decision to offer our commercial lineup of MFC libraries, including Ultimate Toolbox, Ultimate Grid, and Ultimate TCP/IP to the CodeProject.com community free of charge.

These are the full and complete libraries including source code, documentation, samples and examples.

Ultimate Toolbox, Ultimate Grid and Ultimate TCP/IP come with full source code, and are compatible with Microsoft Visual Studio versions 6.0 through 2005.

The Ultimate Toolbox and its related line of MFC library products have been powering professional MFC applications for more than 10 years. We realize that a very large number of users are still writing new applications in Visual C++ and MFC and are looking for effective, proven libraries to enhance their applications, as well as users with legacy code to maintain and update.

By releasing these long-standing MFC libraries to the CodeProject.com community, we hope that they will continue to grow and evolve, and provide a library of useful controls to the development community under the auspices of The Code Project and its members.

codeproject.com: Nish Nishant, MFC under the hood


Most MFC/VC++ programmers generate their projects using the App Wizard and are quite happy with that. Once in a while, a programmer will ask what happened to WinMain, and is normally satisfied with the answer that WinMain is hidden within the MFC libraries. In this article I’ll try to explain the life-cycle of a typical MFC program. Before I do that, I’d like to introduce you to the smallest MFC program you can write that will show a window on screen, other than by using a MessageBox.

Smallest MFC window-program

//NWinApp.h
class CNWinApp  : public CWinApp
{
public:
    BOOL InitInstance();
};
//NWinApp.cpp
#include <afxwin.h>
#include "NWinApp.h"

CNWinApp app;

BOOL CNWinApp::InitInstance()
{
    CFrameWnd *pnframe=new CFrameWnd;
    m_pMainWnd=pnframe;
    pnframe->Create(0,"Buster");
    pnframe->ShowWindow(SW_SHOW);
    return TRUE;
}

So, what happened to good old WinMain?
When you run your program, the operating system first calls the function WinMainCRTStartup. WinMainCRTStartup initializes the CRT routines, then parses the command line, removes the filename portion, and calls WinMain, passing the parsed command line as lpszCommandLine. But then where is WinMain? 🙂 It is defined in appmodul.cpp, which you can find in your MFC\SRC directory. Here is how the function is implemented.

extern "C" int WINAPI
_tWinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
	LPTSTR lpCmdLine, int nCmdShow)
{
    // call shared/exported WinMain
    return AfxWinMain(hInstance, hPrevInstance, lpCmdLine, nCmdShow);
}

As you will observe, WinMain simply calls AfxWinMain. AfxWinMain is defined in winmain.cpp which you will find under your MFC\SRC directory. I’ll list the function below exactly as it is defined.

int AFXAPI AfxWinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
	LPTSTR lpCmdLine, int nCmdShow)
{
    ASSERT(hPrevInstance == NULL);

    int nReturnCode = -1;
    CWinThread* pThread = AfxGetThread();
    CWinApp* pApp = AfxGetApp();

    // AFX internal initialization
    if (!AfxWinInit(hInstance, hPrevInstance, lpCmdLine, nCmdShow))
        goto InitFailure;

    // App global initializations (rare)
    if (pApp != NULL && !pApp->InitApplication())
        goto InitFailure;

    // Perform specific initializations
    if (!pThread->InitInstance())
    {
        if (pThread->m_pMainWnd != NULL)
        {
            TRACE0("Warning: Destroying non-NULL m_pMainWnd\n");
            pThread->m_pMainWnd->DestroyWindow();
        }
        nReturnCode = pThread->ExitInstance();
        goto InitFailure;
    }
    nReturnCode = pThread->Run();

InitFailure:
#ifdef _DEBUG
    // Check for missing AfxLockTempMap calls
    if (AfxGetModuleThreadState()->m_nTempMapLock != 0)
    {
        TRACE1("Warning: Temp map lock count non-zero (%ld).\n",
            AfxGetModuleThreadState()->m_nTempMapLock);
    }
    AfxLockTempMaps();
    AfxUnlockTempMaps(-1);
#endif

    AfxWinTerm();
    return nReturnCode;
}

As you can see, the functions AfxGetThread and AfxGetApp are used to get pointers to the current thread and to the CWinApp-derived global object. If you are surprised that the global CWinApp-derived object already exists, relax: C++ programs create all global and static objects before execution of the program proper begins, well before AfxWinMain gets called. By the way, it must have been a slight shock to you to see a goto in there, eh?

Now, some of you might be wondering where AfxGetThread and AfxGetApp get their information from. The answer is simple: take a look at the CWinApp constructor in appcore.cpp. You’ll find the following two lines.

pThreadState->m_pCurrentWinThread = this;

and

pModuleState->m_pCurrentWinApp = this;
pThreadState is an AFX_MODULE_THREAD_STATE* and pModuleState is an AFX_MODULE_STATE*.

Thus, when we create our global CNWinApp object, its constructor gets called and the AFX_MODULE_STATE structure is set up properly. AfxWinMain then calls AfxWinInit, which initializes the MFC framework, followed by InitApplication (kept for backward compatibility with 16-bit applications), and finally the InitInstance of the CWinApp-derived object. As you can see from our code listing above, we have overridden InitInstance and created a CFrameWnd object there. I’ll repeat the code snippet here so that you won’t have to scroll upwards.

BOOL CNWinApp::InitInstance()
{
    CFrameWnd *pnframe=new CFrameWnd;
    m_pMainWnd=pnframe;
    pnframe->Create(0,"Buster");
    pnframe->ShowWindow(SW_SHOW);
    return TRUE;
}

I have created my CFrameWnd object on the heap; otherwise the window would be destroyed the moment InitInstance exits. I have also set m_pMainWnd to point to my CFrameWnd window. Once InitInstance returns (if it returns FALSE an error is assumed, so we return TRUE), CWinApp::Run is called. The Run function implements our default message loop: it keeps getting and dispatching messages until it receives a WM_QUIT message. Once WM_QUIT is received, Run returns and control goes back to AfxWinMain, which performs clean-up and lastly calls AfxWinTerm, which deletes all the global application structures that were created.

Well, that’s it. Pretty amusing to think that all this while you were writing MFC applications and you never bothered to think about what was happening under the hood.

DISCLAIMER
None of the information in this article is endorsed by me. I don’t work for Microsoft and all my assumptions are just that – assumptions. Some of the info I have given might indeed be incorrect, though I’d say the probability for that is rather low. In case I have erred I’ll be delighted if someone can correct those mistakes.

License
This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

theatlantic.com: Masters of Love: Science says lasting relationships come down to—you guessed it—kindness and generosity


Every day in June, the most popular wedding month of the year, about 13,000 American couples will say “I do,” committing to a lifelong relationship that will be full of friendship, joy, and love that will carry them forward to their final days on this earth.

Except, of course, it doesn’t work out that way for most people. The majority of marriages fail, either ending in divorce and separation or devolving into bitterness and dysfunction. Of all the people who get married, only three in ten remain in healthy, happy marriages, as psychologist Ty Tashiro points out in his book The Science of Happily Ever After, which was published earlier this year.

Social scientists first started studying marriages by observing them in action in the 1970s in response to a crisis: Married couples were divorcing at unprecedented rates. Worried about the impact these divorces would have on the children of the broken marriages, psychologists decided to cast their scientific net on couples, bringing them into the lab to observe them and determine what the ingredients of a healthy, lasting relationship were. Was each unhappy family unhappy in its own way, as Tolstoy claimed, or did the miserable marriages all share something toxic in common?

Psychologist John Gottman was one of those researchers. For the past four decades, he has studied thousands of couples in a quest to figure out what makes relationships work. I recently had the chance to interview Gottman and his wife Julie, also a psychologist, in New York City. Together, the renowned experts on marital stability run The Gottman Institute, which is devoted to helping couples build and maintain loving, healthy relationships based on scientific studies.

John Gottman began gathering his most critical findings in 1986, when he set up “The Love Lab” with his colleague Robert Levenson at the University of Washington. Gottman and Levenson brought newlyweds into the lab and watched them interact with each other. With a team of researchers, they hooked the couples up to electrodes and asked the couples to speak about their relationship, like how they met, a major conflict they were facing together, and a positive memory they had. As they spoke, the electrodes measured the subjects’ blood flow, heart rates, and how much sweat they produced. Then the researchers sent the couples home and followed up with them six years later to see if they were still together.

From the data they gathered, Gottman separated the couples into two major groups: the masters and the disasters. The masters were still happily together after six years. The disasters had either broken up or were chronically unhappy in their marriages. When the researchers analyzed the data they gathered on the couples, they saw clear differences between the masters and disasters. The disasters looked calm during the interviews, but their physiology, measured by the electrodes, told a different story. Their heart rates were quick, their sweat glands were active, and their blood flow was fast. Following thousands of couples longitudinally, Gottman found that the more physiologically active the couples were in the lab, the quicker their relationships deteriorated over time.

But what does physiology have to do with anything? The problem was that the disasters showed all the signs of arousal—of being in fight-or-flight mode—in their relationships. Having a conversation sitting next to their spouse was, to their bodies, like facing off with a saber-toothed tiger. Even when they were talking about pleasant or mundane facets of their relationships, they were prepared to attack and be attacked. This sent their heart rates soaring and made them more aggressive toward each other. For example, each member of a couple could be talking about how their days had gone, and a highly aroused husband might say to his wife, “Why don’t you start talking about your day. It won’t take you very long.”

The masters, by contrast, showed low physiological arousal. They felt calm and connected together, which translated into warm and affectionate behavior, even when they fought. It’s not that the masters had, by default, a better physiological make-up than the disasters; it’s that masters had created a climate of trust and intimacy that made both of them more emotionally and thus physically comfortable.

Gottman wanted to know more about how the masters created that culture of love and intimacy, and how the disasters squashed it. In a follow-up study in 1990, he designed a lab on the University of Washington campus to look like a beautiful bed and breakfast retreat. He invited 130 newlywed couples to spend the day at this retreat and watched them as they did what couples normally do on vacation: cook, clean, listen to music, eat, chat, and hang out. And Gottman made a critical discovery in this study—one that gets at the heart of why some relationships thrive while others languish.

Throughout the day, partners would make requests for connection, what Gottman calls “bids.” For example, say that the husband is a bird enthusiast and notices a goldfinch fly across the yard. He might say to his wife, “Look at that beautiful bird outside!” He’s not just commenting on the bird here: he’s requesting a response from his wife—a sign of interest or support—hoping they’ll connect, however momentarily, over the bird.

The wife now has a choice. She can respond by either “turning toward” or “turning away” from her husband, as Gottman puts it. Though the bird-bid might seem minor and silly, it can actually reveal a lot about the health of the relationship. The husband thought the bird was important enough to bring it up in conversation and the question is whether his wife recognizes and respects that.

People who turned toward their partners in the study responded by engaging the bidder, showing interest and support in the bid. Those who didn’t—those who turned away—would not respond or respond minimally and continue doing whatever they were doing, like watching TV or reading the paper. Sometimes they would respond with overt hostility, saying something like, “Stop interrupting me, I’m reading.”

These bidding interactions had profound effects on marital well-being. Couples who had divorced after a six-year follow up had “turn-toward bids” 33 percent of the time. Only three in ten of their bids for emotional connection were met with intimacy. The couples who were still together after six years had “turn-toward bids” 87 percent of the time. Nine times out of ten, they were meeting their partner’s emotional needs.

By observing these types of interactions, Gottman can predict with up to 94 percent certainty whether couples—straight or gay, rich or poor, childless or not—will be broken up, together and unhappy, or together and happy several years later. Much of it comes down to the spirit couples bring to the relationship. Do they bring kindness and generosity; or contempt, criticism, and hostility?

“There’s a habit of mind that the masters have,” Gottman explained in an interview, “which is this: they are scanning social environment for things they can appreciate and say thank you for. They are building this culture of respect and appreciation very purposefully. Disasters are scanning the social environment for partners’ mistakes.”

“It’s not just scanning environment,” chimed in Julie Gottman. “It’s scanning the partner for what the partner is doing right or scanning him for what he’s doing wrong and criticizing versus respecting him and expressing appreciation.”

Contempt, they have found, is the number one factor that tears couples apart. People who are focused on criticizing their partners miss a whopping 50 percent of positive things their partners are doing and they see negativity when it’s not there. People who give their partner the cold shoulder—deliberately ignoring the partner or responding minimally—damage the relationship by making their partner feel worthless and invisible, as if they’re not there, not valued. And people who treat their partners with contempt and criticize them not only kill the love in the relationship, but they also kill their partner’s ability to fight off viruses and cancers. Being mean is the death knell of relationships.

Kindness, on the other hand, glues couples together. Research independent from theirs has shown that kindness (along with emotional stability) is the most important predictor of satisfaction and stability in a marriage. Kindness makes each partner feel cared for, understood, and validated—feel loved. “My bounty is as boundless as the sea,” says Shakespeare’s Juliet. “My love as deep; the more I give to thee, / The more I have, for both are infinite.” That’s how kindness works too: there’s a great deal of evidence showing the more someone receives or witnesses kindness, the more they will be kind themselves, which leads to upward spirals of love and generosity in a relationship.

There are two ways to think about kindness. You can think about it as a fixed trait: either you have it or you don’t. Or you could think of kindness as a muscle. In some people, that muscle is naturally stronger than in others, but it can grow stronger in everyone with exercise. Masters tend to think about kindness as a muscle. They know that they have to exercise it to keep it in shape. They know, in other words, that a good relationship requires sustained hard work.

“If your partner expresses a need,” explained Julie Gottman, “and you are tired, stressed, or distracted, then the generous spirit comes in when a partner makes a bid, and you still turn toward your partner.”

In that moment, the easy response may be to turn away from your partner and focus on your iPad or your book or the television, to mumble “Uh huh” and move on with your life, but neglecting small moments of emotional connection will slowly wear away at your relationship. Neglect creates distance between partners and breeds resentment in the one who is being ignored.

The hardest time to practice kindness is, of course, during a fight—but this is also the most important time to be kind. Letting contempt and aggression spiral out of control during a conflict can inflict irrevocable damage on a relationship.

“Kindness doesn’t mean that we don’t express our anger,” Julie Gottman explained, “but the kindness informs how we choose to express the anger. You can throw spears at your partner. Or you can explain why you’re hurt and angry, and that’s the kinder path.”

John Gottman elaborated on those spears: “Disasters will say things differently in a fight. Disasters will say ‘You’re late. What’s wrong with you? You’re just like your mom.’ Masters will say ‘I feel bad for picking on you about your lateness, and I know it’s not your fault, but it’s really annoying that you’re late again.’”

For the hundreds of thousands of couples getting married this month—and for the millions of couples currently together, married or not—the lesson from the research is clear: If you want to have a stable, healthy relationship, exercise kindness early and often.

When people think about practicing kindness, they often think about small acts of generosity, like buying each other little gifts or giving one another back rubs every now and then. While those are great examples of generosity, kindness can also be built into the very backbone of a relationship through the way partners interact with each other on a day-to-day basis, whether or not there are back rubs and chocolates involved.

One way to practice kindness is by being generous about your partner’s intentions. From the research of the Gottmans, we know that disasters see negativity in their relationship even when it is not there. An angry wife may assume, for example, that when her husband left the toilet seat up, he was deliberately trying to annoy her. But he may have just absent-mindedly forgotten to put the seat down.

Or say a wife is running late to dinner (again), and the husband assumes that she doesn’t value him enough to show up to their date on time after he took the trouble to make a reservation and leave work early so that they could spend a romantic evening together. But it turns out that the wife was running late because she stopped by a store to pick him up a gift for their special night out. Imagine her joining him for dinner, excited to deliver her gift, only to realize that he’s in a sour mood because he misinterpreted what was motivating her behavior. The ability to interpret your partner’s actions and intentions charitably can soften the sharp edge of conflict.

“Even in relationships where people are frustrated, it’s almost always the case that there are positive things going on and people trying to do the right thing,” psychologist Ty Tashiro told me. “A lot of times, a partner is trying to do the right thing even if it’s executed poorly. So appreciate the intent.”

Another powerful kindness strategy revolves around shared joy. One of the telltale signs of the disaster couples Gottman studied was their inability to connect over each other’s good news. When one person in the relationship shared the good news of, say, a promotion at work with excitement, the other would respond with wooden disinterest by checking his watch or shutting the conversation down with a comment like, “That’s nice.”

We’ve all heard that partners should be there for each other when the going gets rough. But research shows that being there for each other when things go right is actually more important for relationship quality. How someone responds to a partner’s good news can have dramatic consequences for the relationship.

In one study from 2006, psychological researcher Shelly Gable and her colleagues brought young adult couples into the lab to discuss recent positive events from their lives. The psychologists wanted to know how partners would respond to each other’s good news. They found that, in general, couples responded to each other’s good news in four different ways, which they called passive destructive, active destructive, passive constructive, and active constructive.

Let’s say that one partner had recently received the excellent news that she got into medical school. She would say something like “I got into my top choice med school!”

If her partner responded in a passive destructive manner, he would ignore the event. For example, he might say something like: “You wouldn’t believe the great news I got yesterday! I won a free t-shirt!”

If her partner responded in a passive constructive way, he would acknowledge the good news, but in a half-hearted, understated way. A typical passive constructive response is saying “That’s great, babe” as he texts his buddy on his phone.

In the third kind of response, active destructive, the partner would diminish the good news his partner just got: “Are you sure you can handle all the studying? And what about the cost? Med school is so expensive!”

Finally, there’s active constructive responding. If her partner responded in this way, he stopped what he was doing and engaged wholeheartedly with her: “That’s great! Congratulations! When did you find out? Did they call you? What classes will you take first semester?”

Among the four response styles, active constructive responding is the kindest. While the other response styles are joy-killers, active constructive responding allows the partner to savor her joy and gives the couple an opportunity to bond over the good news. In the parlance of the Gottmans, active constructive responding is a way of “turning toward” your partner’s bid (sharing the good news) rather than “turning away” from it.

Active constructive responding is critical for healthy relationships. In the 2006 study, Gable and her colleagues followed up with the couples two months later to see if they were still together. The psychologists found that the only difference between the couples who were together and those who broke up was active constructive responding. Those who showed genuine interest in their partner’s joys were more likely to be together. In an earlier study, Gable found that active constructive responding was also associated with higher relationship quality and more intimacy between partners.

There are many reasons why relationships fail, but if you look at what drives the deterioration of many relationships, it’s often a breakdown of kindness. As the normal stresses of a life together pile up—with children, career, friends, in-laws, and other distractions crowding out the time for romance and intimacy—couples may put less effort into their relationship and let the petty grievances they hold against one another tear them apart. In most marriages, levels of satisfaction drop dramatically within the first few years together. But among couples who not only endure, but live happily together for years and years, the spirit of kindness and generosity guides them forward.

Education, income, and family background of municipal politicians in NRW

Education, income, and family background of municipal politicians in NRW: the highly selected group of elected members of municipal and city councils

Do NRW’s municipal representatives carry a political inheritance from their parents? Based on the available data, this question can certainly be answered with “yes”. This becomes particularly clear for the two major “people’s parties”: 96% of the politically active parents of the CDU respondents were themselves members of the CDU. The picture is similar for the SPD respondents: 80% of those whose mothers and fathers were already politically engaged stated that their parents were likewise members of the SPD. It can therefore be noted that politically active parents of the major parties’ municipal representatives have passed a party-political inheritance on to their children. For the other parties examined, no consistent pattern emerges; the parents’ party activity is less clear-cut and in some cases spans the entire party system.

Nevertheless, there are patterns that hold across all parties: regardless of party affiliation, municipal representatives appear on average to hold higher school-leaving qualifications than their voters. A majority of 53% of respondents name the Abitur as their highest school qualification; in the overall population of North Rhine-Westphalia this share is far lower, at 29%. Conversely, Hauptschule graduates form the smallest group among municipal representatives, at 8%, whereas in the overall population this is the most common qualification, at 34%.

Equally remarkable is the skew in the representatives’ income compared to the electorate, shown in Table 1.

Table 1: Income comparison

Income            Municipal elites   General population
Up to €1,300      18%                48%
€1,300 – €2,000   12%                30%
€2,000 – €2,900   28%                12%
€2,900 – €3,500   18%                6%
Over €3,500       25%                4%
The source for the population comparison is the ALLBUS 2014. The reference population is persons living in North Rhine-Westphalia who have reached the age of 18 and hold German citizenship. 159 of the 165 municipal representatives surveyed answered this question.

In 2014, 50% of the population across Germany had a net income of up to €1,300 per month. At 48%, this figure is almost identical in the NRW sample and marks the most heavily populated income category. By comparison, only a scant fifth of the municipal representatives surveyed report being in this income group. The picture is mirrored in the highest income class: a quarter of the politicians earn more than €3,500 net per month, while only 4% of NRW’s general population receive such a salary.

Also striking is the frequency of the income groups relative to their size. For the general population the rule seems to be: the higher the income level, the less often it is reached. This does not hold for the municipal representatives surveyed, among whom the middle category of €2,000 – €2,900 is named most often; no clear pattern is discernible. In conclusion, municipal representatives are on average considerably wealthier than their voters, and the income classes are therefore heavily skewed in the politicians’ favor.

In summary, the question of who NRW’s municipal politicians are can be answered as follows: they are predominantly married, well-earning men and women who work as better-paid salaried employees and have a high level of formal education. Even when differentiated by the local units the municipal politicians come from, only minor irregularities appear, for instance regarding marital status in large municipalities and the distribution of education in small ones.

Based on all of these observations from the study, it can be assumed that municipal politicians, too, constitute an elite in the sense of a self-recruiting group. These points not only give us a better picture and understanding of the backgrounds of municipal politicians in NRW, but also raise questions that remain to be answered.

ibm.com: Windows to UNIX porting, Part 1: Porting C/C++ sources

Windows to UNIX porting, Part 1: Porting C/C++ sources


Demystifying the process of porting a C/C++-based project from Windows to UNIX

Software programs are often made to run on systems that are completely different from the system in which the program is coded or developed. This process of adapting software across systems is known as porting. You might need to port software for any one of several reasons. Perhaps your end users want to use the software in a new environment, such as a different version of UNIX®, or perhaps your developers are integrating their own code into the software to optimize it for your organization’s platform.

18 September 2007

Most Microsoft® Windows®-based projects are built using Microsoft Visual Studio®, which has a sophisticated integrated development environment (IDE) that automates almost the entire build process for the developer. In addition, Windows developers use Windows platform-specific application program interfaces (APIs), headers, and language extensions. Most UNIX®-like systems, such as SunOS, OpenBSD, and IRIX, don’t support an IDE or any Windows-specific headers or extensions, thereby making porting a time-consuming activity. To make matters more complicated, legacy Windows-based code was meant to be run on 16-bit or 32-bit x86 architecture. UNIX-based environments are often 64-bit, and most UNIX vendors don’t support the x86 instruction set. This article, the first in a two-part series, demystifies the process of porting a typical Visual C++ project in a Windows operating system to a g++ environment in SunOS while addressing the aforementioned issues in some detail.

C/C++ project types in Visual Studio

You can use a Visual C++ project to create one of three kinds of output (each in single- or multi-threaded variants):

Dynamic-link library (DLL or .dll)
Static library (LIB or .lib)
Executable (.exe)

For more complex variants, use a Visual Studio .NET solution—this solution makes it possible to create and manage multiple projects. The next couple of sections in this document focus on porting dynamic and static library project variants from Windows to UNIX.

Porting a DLL to a UNIX environment

The UNIX equivalent of a .dll file in Windows is a shared object (.so) file. However, the process of creating a .so file is rather different from that of creating a .dll file. Consider the example in Listing 1, where you try to create a small .dll file that has a single function, printHello, which is called from the main routine in the main.cpp file.

Listing 1. File hello.h containing the declaration for the printHello routine

#ifdef BUILDING_DLL
#define PRINT_API __declspec(dllexport)
#else
#define PRINT_API __declspec(dllimport)
#endif

extern "C" PRINT_API void printHello();

Listing 2 provides the source code for hello.cpp.

Listing 2. File hello.cpp

#include <iostream>
#include "hello.h"

extern "C" PRINT_API void printHello()
{
    std::cout << "hello Windows/UNIX users\n";
}

If you use the Microsoft 32-bit C/C++ standard compiler for 80×86 platforms (cl), the following command creates the hello.dll file:

cl /LD hello.cpp /DBUILDING_DLL

/LD instructs cl to create a .dll file (it can be instructed to create other formats, such as .exe or .obj). /DBUILDING_DLL defines the PRINT_API macro for this particular build so that the printHello symbol is exported from this DLL.

Listing 3 contains the main.cpp main source file, which uses the printHello routine. The assumption here is that hello.h, hello.cpp, and main.cpp are all in the same folder.

Listing 3. Main sources using the printHello routine

#include "hello.h"

int main()
{
    printHello();
    return 0;
}

To compile and link the main code, use the following command line:

cl main.cpp hello.lib

A quick inspection of the sources and the generated output reveals two important facts. First, the Windows-specific syntax __declspec(dllexport) is needed to export any functions, variables, or classes from a DLL, and likewise __declspec(dllimport) is needed to import them. Second, the compilation generates two files, hello.dll and hello.lib, and hello.lib is what the main sources are linked against. In contrast, UNIX headers for shared objects don't need the declspec syntax, and the output of a successful compilation is a single .so file that gets linked with the main sources.

To create a shared library on UNIX platforms using g++, compile all source files as relocatable shared objects by passing the -fPIC flag to g++; PIC stands for position-independent code. A shared library is potentially mapped to a new memory address every time it is loaded, so it makes sense to generate the addresses of all variables and functions inside the library in a way that can easily be computed relative to the address at which the library is loaded. The -fPIC option generates such relocatable code. The -o option specifies the name of the output file, and the -shared option builds a shared library in which unresolved references are allowed. To create the hello.so file, you must modify the header as shown in Listing 4 below.

Listing 4. Modified header for hello.h with UNIX-specific changes

#if defined (__GNUC__) && defined(__unix__)
#  define PRINT_API __attribute__ ((__visibility__("default")))
#elif defined (WIN32)
#  ifdef BUILDING_DLL
#    define PRINT_API __declspec(dllexport)
#  else
#    define PRINT_API __declspec(dllimport)
#  endif
#endif

extern "C" PRINT_API void printHello();

And here’s the g++ command for linking the shared library hello.so:

g++ -fPIC -shared hello.cpp -o hello.so

To create the main executable, compile the sources:

g++ -o main main.cpp hello.so

Symbol hiding in g++

There are two typical ways to export symbols from a Windows-based DLL. The first method is to use __declspec(dllexport) only on select elements (for example, classes, global variables, or global functions) that are exported from the DLL. The second method is to use a module-definition (.def) file. .def files have their own syntax and contain the symbols that need to be exported from the DLL.

The default behavior of the g++ linker is to export all the symbols from a .so file. This might not be desirable, and it makes linking multiple DLLs a time-consuming task. To selectively export symbols from a shared library, use the g++ attribute mechanism. For example, suppose the user sources have two methods, 'void print1();' and 'int print2(char*);', and only print2 needs to be exported. Listing 5 shows a way of achieving this in both Windows and UNIX.

Listing 5. Hiding symbols in g++

#ifdef _MSC_VER  // Visual Studio-specific macro
#  ifdef BUILDING_DLL
#    define DLLEXPORT __declspec(dllexport)
#  else
#    define DLLEXPORT __declspec(dllimport)
#  endif
#  define DLLLOCAL
#else
#  define DLLEXPORT __attribute__ ((visibility("default")))
#  define DLLLOCAL __attribute__ ((visibility("hidden")))
#endif

extern "C" DLLLOCAL void print1();       // print1 hidden
extern "C" DLLEXPORT int print2(char*);  // print2 exported

Using __attribute__ ((visibility("hidden"))) prevents a symbol from being exported from a DLL. The latest versions of g++ (4.0.0 and higher) also provide the -fvisibility switch, which you can use to selectively export symbols from a shared library. Passing -fvisibility=hidden on the g++ command line suppresses the export of all symbols from a shared library except those declared with __attribute__ ((visibility("default"))). This is a neat way of telling g++ that every declaration not explicitly marked with a visibility attribute has hidden visibility. Using dlsym to look up a hidden symbol returns NULL.

Overview of the attribute mechanism in g++

Much like the Visual Studio environment, which provides a lot of additional syntax on top of C/C++, g++ supports many non-standard extensions to the language. One of these, the attribute mechanism, is handy for porting purposes. The previous example discussed symbol hiding; another use of attributes is to specify calling conventions such as cdecl, stdcall, and fastcall, as in Visual C++. Part 2 of this series discusses the attribute mechanism in greater detail.

Explicit DLL or shared object loading in a UNIX environment

In Windows systems, it is quite common for a .dll file to be explicitly loaded by a Windows program. For example, consider a sophisticated Windows-based editor that has printing capabilities. Such an editor would dynamically load the DLL for the printer driver the first time a user makes the corresponding request. Windows developers use the API LoadLibrary to explicitly load a DLL, GetProcAddress to query for a symbol from the DLL, and FreeLibrary to unload an explicitly loaded DLL. The UNIX equivalents of these functions are the dlopen, dlsym, and dlclose routines. Further, in Windows there is a special DllMain method that is invoked the first time the DLL is loaded into memory; UNIX-like systems have a corresponding method called _init.

Consider a variant of the previous example. Listing 6 is the loadlib.h header file, which is used by the sources that contain the main method.

Listing 6. Header file loadlib.h

#ifndef __LOADLIB_H
#define __LOADLIB_H

#ifdef UNIX
#  include <dlfcn.h>
#endif
#include <iostream>
using namespace std;

typedef void* (*funcPtr)();

#ifdef UNIX
#  define IMPORT_DIRECTIVE __attribute__((__visibility__("default")))
#  define CALL
#else
#  define IMPORT_DIRECTIVE __declspec(dllimport)
#  define CALL __stdcall
#endif

extern "C" {
    IMPORT_DIRECTIVE void* CALL LoadLibraryA(const char* sLibName);
    IMPORT_DIRECTIVE funcPtr CALL GetProcAddress(void* hModule,
                                                 const char* lpProcName);
    IMPORT_DIRECTIVE bool CALL FreeLibrary(void* hLib);
}
#endif

The main method now explicitly loads the hello library (hello.dll on Windows, hello.so on UNIX) and invokes its printHello routine, as shown in Listing 7.

Listing 7. Main file Loadlib.cpp

#include "loadlib.h"

int main(int argc, char* argv[])
{
#ifndef UNIX
    char* fileName = "hello.dll";
    void* libraryHandle = LoadLibraryA(fileName);
    if (libraryHandle == NULL)
        cout << "dll not found" << endl;
    else
        // make a call to "printHello" from the hello.dll
        (GetProcAddress(libraryHandle, "printHello"))();
    FreeLibrary(libraryHandle);
#else   // unix
    void (*voidfnc)();
    char* fileName = "hello.so";
    void* libraryHandle = dlopen(fileName, RTLD_LAZY);
    if (libraryHandle == NULL)
        cout << "shared object not found" << endl;
    else
    {
        // make a call to "printHello" from the hello.so
        voidfnc = (void (*)())dlsym(libraryHandle, "printHello");
        (*voidfnc)();
    }
    dlclose(libraryHandle);
#endif
    return 0;
}

DLL search path in Windows and UNIX environments

In Windows operating systems, a DLL is searched for in the following order:

1. The directory where the executable is located (for example, notepad.exe in Windows)
2. The current working directory (that is, the directory from which notepad.exe is launched)
3. The Windows system directory (typically C:\Windows\System32)
4. The Windows directory (typically C:\Windows)
5. The directories listed in the PATH environment variable

In UNIX-like systems, such as Solaris, the LD_LIBRARY_PATH environment variable specifies the shared library search order. The path to a new shared library needs to be appended to the LD_LIBRARY_PATH variable. The search order for HP-UX involves directories listed as part of LD_LIBRARY_PATH followed by those in SHLIB_PATH. For IBM AIX® operating systems, it’s the LIBPATH variable that determines the shared library search order.

Porting a static library from Windows to UNIX

The object code of static libraries, as opposed to dynamic-link libraries, is linked in when the application compiles and thus becomes part of the application. Static libraries in UNIX systems follow a naming convention in which lib is prefixed and .a is suffixed to the library name; for example, the Windows user.lib file would typically be named libuser.a on a UNIX system. The operating system-provided commands ar and ranlib are used to create static libraries. Listing 8 illustrates how to create a static library, libuser.a, from the user_sqrt1.cpp and user_log1.cpp source files.

Listing 8. Creating a static library in a UNIX environment

g++ -o user_sqrt1.o -c user_sqrt1.cpp
g++ -o user_log1.o -c user_log1.cpp
ar rc libuser.a user_sqrt1.o user_log1.o
ranlib libuser.a

The ar tool creates a static library, libuser.a, and puts copies of the user_sqrt1.o and user_log1.o object files in it. If there is an existing library file, the object files are added to it. If the object files being used are newer than those inside the library, the older ones are replaced. The r flag replaces older object files in the library with newer versions of the same object files. If it doesn’t exist yet, the c option creates the library.

After a new archive is created or an existing one is modified, an index of the archive contents needs to be created and stored as part of the archive. The index lists each symbol defined by a member of the archive that is a relocatable object file. The index speeds up linking with the static library and allows routines in the library to be called irrespective of their actual placement inside the library. Note that GNU ranlib is an extension of the ar tool, and invoking ar with the s argument, [ar -s], has the same effect as invoking ranlib.

Precompiled headers

C/C++-based applications in Visual C++ often use precompiled headers. Precompiled headers are a performance feature that certain compilers, such as cl in Visual Studio, provide to help speed up compilation. Complex applications often make heavy use of header (.h or .hpp) files, which are sections of code meant to be included in one or more source files. Header files are modified only rarely during the course of a project. Thus, to speed up compilation, they can be converted into an intermediate form that is easier for the compiler to process, so that subsequent compilations are faster. In the Visual Studio environment this intermediate form is called a precompiled header (PCH) file.

Consider the example involving hello.cpp in Listings 1 and 2 earlier in this article. The inclusion of iostream and the definition of the EXPORT_API macro can be considered code-invariant parts of the file throughout the scope of the project. Thus, they are good candidates for inclusion in a header file. Listing 9 shows what the code looks like with the relevant changes.

Listing 9. Contents of precomp.h

#ifndef __PRECOMP_H
#define __PRECOMP_H

#include <iostream>

#if defined (__GNUC__) && defined(__unix__)
#  define EXPORT_API __attribute__((__visibility__("default")))
#elif defined (WIN32)
#  define EXPORT_API __declspec(dllexport)
#endif

#endif  // __PRECOMP_H

Listing 10 shows the source code of the DLL with the relevant changes.

Listing 10. Contents of new hello.cpp file

#include "precomp.h"
#pragma hdrstop

extern "C" EXPORT_API void printHello()
{
    std::cout << "hello Windows/UNIX users" << std::endl;
}

As the name suggests, a precompiled header file contains, in an already compiled form, the code that is included before the header stop point. This point in the source file is usually marked by the first lexeme that is not consumed as a token by the preprocessor, meaning one that is not a preprocessor directive. Alternatively, the header stop point can be specified explicitly with #pragma hdrstop, provided it is encountered before any valid non-preprocessor language keyword in the source text.

In a Solaris build, a precompiled header file is searched for whenever an #include is encountered during compilation. As it searches for the included file, the compiler looks for a precompiled header in each directory just before it looks for the include file in that directory. The name searched for is the name specified in the #include with .gch appended. If the precompiled header file can't be used, it is ignored.

Here is the command line for achieving precompiled header facility in Windows:

cl /Yc precomp.h hello.cpp /DWIN32 /LD

/Yc tells the cl compiler to generate the precompiled header from precomp.h. The same functionality is achieved in Solaris using the following lines:

g++ precomp.h
g++ -fPIC -G hello.cpp -o hello.so

The first command creates the precompiled header precomp.h.gch. The rest of the procedure for generating the shared object is the same as described earlier in the article.

Note: Support for precompiled headers in g++ is available for versions 3.4 and above.

Conclusion

Porting across two completely divergent systems, such as Windows and UNIX, is never an easy task and, as such, requires a lot of tweaking and patience. This article explained the essentials of porting the most basic project types from a Visual Studio environment to a g++/Solaris-based one. The second and concluding article in this series discusses the multitude of compiler options available in the Visual Studio environment and their g++ equivalents, the g++ attribute mechanism, some of the problems in porting from a 32-bit (typically Windows) to a 64-bit (UNIX) environment, and multithreading.