Note to recruiters

Note to recruiters: We are quite aware that recruiters, interviewers, VCs and other professionals generally perform a Google search before they interview someone, take a pitch from someone, et cetera. Please keep in mind that not everything put on the Internet must align directly with one's future career and/or one's future product portfolio. Sometimes, people put things on the Internet just because. Just because. It may be out of their personal interests, which may have nothing to do with their professional interests. Or it may be for some other reason. Recruiters seem to have the wrong-headed notion that if somebody is not signalling interest in a certain area online, then they are not interested in that area at all. It is worth pointing out that economics pretty much underlies the areas of marketing, strategy, operations and finance. And this blog is about economics. With metta, let us, by all means, be reflective about this whole business of business. Also, see our post on "The Multi-faceted Identity Problem".

Saturday, October 15, 2016

TECHNOLOGY: Skilled Foreign Workers a Boon to Pay, Study Finds

Want a pay raise? Ask your employer to hire more immigrant scientists.

That's the general conclusion of a study that examined wage data and immigration in 219 metropolitan areas from 1990 to 2010. Researchers found that cities seeing the biggest influx of foreign-born workers in science, technology, engineering and mathematics—the so-called STEM professions—saw wages climb fastest for the native-born, college-educated population.

Wednesday, September 14, 2016

TECHNOLOGY: Fixing STEM pipeline problems to aid STEM diversity


Decades of effort to increase the number of minority students entering the metaphorical science, technology, engineering, and math (STEM) pipeline haven't changed this fact: traditionally underrepresented groups remain underrepresented. In a new paper in the journal BioScience, two Brown University biologists analyze the pipeline's flawed flow and propose four research-based ideas to ensure that more students emerge from the far end with Ph.D.s and STEM careers.

Thursday, August 11, 2016

INNOVATION: World’s First Parallel Computer Based on Biomolecular Motor

And now, news from Germany.
A new parallel-computing approach can solve combinatorial problems, according to a study published in Proceedings of the National Academy of Sciences. Researchers from the Max Planck Institute of Molecular Cell Biology and Genetics and the Dresden University of Technology collaborated with an international team on the technology. The researchers note that significant advances have been made in conventional electronic computers in the past decades, but that their sequential nature prevents them from solving problems of a combinatorial nature. The number of calculations required to solve such problems grows exponentially with the size of the problem, making them intractable for sequential computing. The new approach addresses these issues by combining well-established nanofabrication technology with molecular motors that are very energy-efficient and inherently work in parallel. The researchers demonstrated the parallel-computing approach on a benchmark combinatorial problem that is very difficult to solve with sequential computers. The team says the approach is scalable, error-tolerant, and dramatically improves the time needed to solve combinatorial problems of size N. The problem to be solved is "encoded" within a network of nanoscale channels, both by mathematically designing a geometrical network capable of representing the problem and by fabricating a physical network based on this design using lithography. The network is then explored in parallel by many protein filaments self-propelled by a molecular layer of motor proteins covering the bottom of the channels.
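As a rough software analogue of the idea, here is a minimal sketch. It assumes, purely for illustration, that the benchmark combinatorial problem is a small subset-sum instance (the summary above does not name the problem): each agent's path through the network corresponds to one candidate subset, junctions encode include/exclude choices, and the exit position equals the subset sum. In the physical device all agents explore the network simultaneously; the sketch simply enumerates the 2^N paths, which is exactly the exponential blow-up the parallel hardware is meant to absorb.

```python
"""
Toy software analogue of the network computer described above. Assumptions for
illustration only: the benchmark problem is a small subset-sum instance, each
agent's path corresponds to one candidate subset, and an agent's exit position
equals the sum of the values it picked up at 'split' junctions. The real device
explores all paths in parallel; this sketch enumerates them one by one.
"""
from itertools import product

def explore_network(values):
    """Enumerate all 2**N paths; group them by exit position (subset sum)."""
    exits = {}
    for choices in product((0, 1), repeat=len(values)):   # one agent per path
        subset = [v for v, c in zip(values, choices) if c]
        exits.setdefault(sum(subset), []).append(subset)
    return exits

if __name__ == "__main__":
    values = [2, 5, 9]                     # hypothetical problem instance
    for total, subsets in sorted(explore_network(values).items()):
        print(f"exit {total:2d}: reached by {subsets}")
```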

Saturday, July 2, 2016

INNOVATION: Computers read 1.8 billion words of fiction to learn how to anticipate human behaviour

Meanwhile at Stanford:
Researchers at Stanford University are using 600,000 fictional stories to inform their new knowledge base called Augur. The team considers the approach to be an easier, more affordable, and more effective way to train computers to understand and anticipate human behavior. Augur is designed to power vector machines in making predictions about what an individual user might be about to do, or want to do next. The system's current success rate is 71 percent for unsupervised predictions of what a user will do next, and 96 percent for recall, or identification of human events. The researchers report dramatic stories can introduce comical errors into a machine-based prediction system. "While we tend to think about stories in terms of the dramatic and unusual events that shape their plots, stories are also filled with prosaic information about how we navigate and react to our everyday surroundings," they say. The researchers note artificial intelligence will need to put scenes and objects into an appropriate context. They say crowdsourcing or similar user-feedback systems will likely be needed to amend some of the more dramatic associations certain objects or situations might inspire.
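For a sense of the general idea only, here is a minimal sketch of mining which everyday activities tend to follow one another in story text and using those counts to predict a plausible next activity. The tiny corpus, the activity vocabulary, and the bigram counting are illustrative assumptions, not Augur's actual data or pipeline.

```python
"""
A minimal sketch of the general idea behind Augur as summarized above: count
which human activities tend to follow one another in story text, then predict
a plausible next activity. The corpus, the activity vocabulary, and the
bigram-counting approach are illustrative assumptions only.
"""
from collections import Counter, defaultdict

ACTIVITIES = {"wake", "shower", "dress", "eat", "drive", "work", "sleep"}

def mine_transitions(sentences):
    """Count how often one known activity is followed by another."""
    transitions = defaultdict(Counter)
    for sentence in sentences:
        acts = [w for w in sentence.lower().split() if w in ACTIVITIES]
        for a, b in zip(acts, acts[1:]):
            transitions[a][b] += 1
    return transitions

def predict_next(transitions, activity):
    """Return the most frequently observed follow-up activity, if any."""
    follow_ups = transitions.get(activity)
    return follow_ups.most_common(1)[0][0] if follow_ups else None

if __name__ == "__main__":
    corpus = [
        "She would wake early, shower quickly and dress for the office",
        "He would wake late, eat breakfast and drive to work",
        "They wake at dawn and eat before the long drive",
    ]
    model = mine_transitions(corpus)
    print(predict_next(model, "wake"))   # most common activity after 'wake'
```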

Monday, May 30, 2016

INNOVATION: Mathematical model to explain how things go viral

Interesting research on virality. At the University of Aberdeen:
A University of Aberdeen-led research team has developed a model that explains how things go viral in social networks, and it includes the impact of friends and acquaintances in the sudden spread of new ideas. "Mathematical models proposed in the past typically neglected the synergistic effects of acquaintances and were unable to explain explosive contagion, but we show that these effects are ultimately responsible for whether something catches on quickly," says University of Aberdeen researcher Francisco Perez-Reche. The model shows people's opposition to accepting a new idea acts as a barrier to large contagion, until the transmission of the phenomenon becomes strong enough to overcome that reluctance. Although social media makes the explosive contagion phenomenon more apparent in everyday life than ever before, it is the intrinsic value of the idea or product, and whether friends and acquaintances adopt it or not, which remains the crucial factor. The model potentially could be used to address social issues, or by companies to give their product an edge over competitors. "Our conclusions rely on numerical simulations and analytical calculations for a variety of contagion models, and we anticipate that the new understanding provided by our study will have important implications in real social scenarios," Perez-Reche says.
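Here is an illustrative simulation, not the Aberdeen team's actual model: a contagion process on a random graph in which each additional adopting neighbour synergistically boosts a holdout's chance of adopting. The graph size, base probability, and synergy strength are assumptions chosen only to make the qualitative effect visible.

```python
"""
Illustrative sketch (not the published model): contagion on a random graph in
which the chance that a holdout adopts grows synergistically with the number
of neighbours who have already adopted. All parameters are assumptions.
"""
import random

def simulate(n=200, avg_degree=6, p_base=0.02, synergy=1.0, steps=30, seed=1):
    rng = random.Random(seed)
    # Build an Erdos-Renyi-style random graph.
    neighbours = {i: set() for i in range(n)}
    p_edge = avg_degree / (n - 1)
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p_edge:
                neighbours[i].add(j)
                neighbours[j].add(i)
    adopted = {0}                        # a single initial adopter
    for _ in range(steps):
        new = set()
        for node in range(n):
            if node in adopted:
                continue
            k = len(neighbours[node] & adopted)
            if k == 0:
                continue
            # Synergy: each extra adopting neighbour multiplies the odds.
            p = min(1.0, p_base * k * (1 + synergy) ** (k - 1))
            if rng.random() < p:
                new.add(node)
        if not new:
            break
        adopted |= new
    return len(adopted) / n

if __name__ == "__main__":
    for s in (0.0, 1.0, 3.0):
        print(f"synergy={s}: final adoption fraction {simulate(synergy=s):.2f}")
```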

Monday, April 25, 2016

On the "Smartest Living People in the World"

I was lazily whiling my time away on the Internet - a very small bit of time, because I am just very busy these days - and here is something I came across.

A few comments on the video: it is certainly true that Stephen Hawking is an awe-inspiring figure in academic physics. But to compare him with the other people on the list seems odd. It simply does not do justice to the towering achievements of this eminent physicist. Using IQ as a statistical measure hardly captures the scope and depth of Hawking's work. (Indeed, I was trying to impress upon a mathematician friend of mine that intelligence should be viewed as a partial order, not a total order. Since IQ represents intelligence on a flat, linear scale, it cannot be used as the only measure of intelligence.) Hawking is the sort of genius whose work cannot be easily compared to that of anyone else, and it is quite wrong-headed to even compare him against the others in this list insofar as actual scholarly work is concerned. A second comment is also in order, regarding Christopher Langan: I must note a correction to what has been claimed in the video. Langan's work, while done in earnest, does not quite stand up to scholarly scrutiny. Here is my Quora post on the same.
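To make the partial-order point concrete, here is a minimal sketch (the dimensions and numbers are invented for illustration): under componentwise dominance, two multi-dimensional ability profiles can be incomparable, whereas collapsing each profile to a single scalar, as IQ does, forces every pair into a total order.

```python
"""
A minimal sketch of the partial-order point above. The ability dimensions and
numbers are purely hypothetical; the point is only that componentwise
dominance is a partial order (some pairs are incomparable), while a single
scalar score forces a total order.
"""

def dominates(a, b):
    """Partial order: a is at least as good on every dimension, better on one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# Hypothetical profiles over (theoretical depth, breadth, verbal reasoning).
scholar_a = (99, 70, 80)
scholar_b = (75, 95, 85)

print(dominates(scholar_a, scholar_b))   # False
print(dominates(scholar_b, scholar_a))   # False -> incomparable in the partial order
print(sum(scholar_a) / 3, sum(scholar_b) / 3)  # a scalar average ranks them anyway
```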


What does the Theoretic Model of the Universe by Christopher Langan say?

Anand Manikutty, Independent stats & compsci consultant for companies in finance, s/w & hi tech

The Theoretic Model of the Universe by Christopher Langan is supposed to be what the title says - a model of the universe.

I have not read the whole thing, but the parts that I went through did not stand up to scholarly scrutiny. In fact, I can say with a great deal of confidence that the Theoretic Model of the Universe (or the Cognitive-Theoretic Model of the Universe) is mistaken from multiple perspectives - physics and psychology, to name just a couple. Each field has its own episteme, and you need to understand the epistemes of a particular field to comment on a work such as this. I can say, based on my knowledge of the fields of physics and psychology, that the entire enterprise is misguided to begin with.
Wishing Chris Langan the best, of course, in his scholarly pursuits. You never know what may be possible until you try. At least he tried.

Friday, April 15, 2016

TECHNOLOGY: How Can Supercomputers Survive a Drought?

Water scarcity has been surfacing as a critical issue in the U.S. and around the globe. A McKinsey-led report projects that, by 2030, global water demand will exceed supply by 40%. According to another recent report by the Congressional Research Service (CRS), more than 70% of the land area in the U.S. was under drought conditions during August 2012.

By 2014, conditions had become even worse in some states: following a three-year dry period, California declared a state-wide drought emergency. A report by NBC News quotes California Gov. Jerry Brown as calling it "perhaps the worst drought California has ever seen since records began being kept about 100 years ago". Such evidence of extended droughts and water scarcity has necessitated concerted approaches to tackling the global crisis and ensuring water sustainability.

Supercomputers are notorious for consuming a significant amount of electricity, but a less-known fact is that supercomputers are also extremely “thirsty” and consume a huge amount of water to cool down servers through cooling towers that are typically located on the roof of supercomputer facilities. While high-density servers packed in a supercomputer center can save space and/or costs, they also generate a large amount of heat which, if not properly removed, could damage the equipment and result in huge economic losses.

Water's high heat capacity makes it an ideal and energy-efficient medium for rejecting server heat into the environment through evaporation, an old yet effective cooling mechanism. According to Amazon's James Hamilton, a 15MW data center could guzzle up to 360,000 gallons of water per day. The U.S. National Security Agency's data center in Utah would require up to 1.7 million gallons of water per day, enough to satiate over 10,000 households' water needs.

Although water consumption is related to energy consumption, the two also differ: because water efficiency varies over time with outside temperatures, the same amount of server energy consumed at different times may result in different amounts of water evaporation in cooling towers. In addition to onsite cooling towers, their enormous appetite for electricity also makes supercomputers accountable for offsite water consumption embedded in electricity production. As a matter of fact, electricity production accounts for the largest water withdrawal among all sectors in the U.S. While not all of that water withdrawal is consumed or "lost" via evaporation, the national average water consumption for generating just one kWh of electricity still reaches 1.8 L/kWh, even excluding hydropower, which is itself a huge water consumer.
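A back-of-the-envelope check ties these figures together. The 15MW load, the 360,000 gallons-per-day estimate, and the 1.8 L/kWh off-site figure come from the passage above; the per-kWh on-site figure is merely implied by them, and the combined total is an illustrative calculation rather than a measured number.

```python
"""
Back-of-the-envelope check of the figures quoted above. The 15 MW load, the
360,000 gallons/day estimate and the 1.8 L/kWh off-site figure come from the
text; the on-site per-kWh figure is implied by them, and the combined total
is illustrative only.
"""
GALLON_L = 3.785                                     # litres per US gallon

power_mw = 15.0
energy_kwh_per_day = power_mw * 1000 * 24            # 360,000 kWh per day

onsite_l_per_day = 360_000 * GALLON_L                # article's on-site estimate
onsite_wue = onsite_l_per_day / energy_kwh_per_day   # implied on-site L per kWh

offsite_wue = 1.8                                    # L/kWh embedded in electricity
offsite_l_per_day = offsite_wue * energy_kwh_per_day

total_l_per_day = onsite_l_per_day + offsite_l_per_day
print(f"implied on-site water use : {onsite_wue:.2f} L/kWh")
print(f"off-site (electricity)    : {offsite_l_per_day / GALLON_L:,.0f} gallons/day")
print(f"combined footprint        : {total_l_per_day / GALLON_L:,.0f} gallons/day")
```

Under these assumptions, the off-site water embedded in electricity adds roughly half again on top of the on-site cooling draw, which is why the passage treats the two separately.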