Political Calculations
Unexpectedly Intriguing!
31 October 2018

It is All Hallow's Eve once again, and here at Political Calculations, that means taking time out to engage in what some might call pure silliness, but we call a public service. Because on this day, as on Halloweens past, we find the fear in furniture, where everyday objects lose their comfort and charm and instead become something... sinister.

This year's journey into the dark side of seating begins with a truly terrifying design concept: the Terra! Grass Chair. This was a Kickstarter project that combined a cardboard pattern, dirt and grass seed to create outdoor seating. The kind that you can easily hide bodies under.

Terra! Grass Chair

As a general rule, seating is meant to facilitate social activity, where sharing comfort is key. But what if you despise such things?

Perhaps then, you might consider the Delirium Chair, which has been described as "a tribute to a balance between city and nature, artificial and natural, Techne and Psyche", whose "form flows organically but also reminds us of rigid systems like highways".

Delirium Chair

You can see artistic shots of the Delirium Chair interacting with people (or rather, a professional model) here, where if you pay close attention to the design, you'll appreciate its potential application for use in public settings where preventing homeless people from sleeping upon the public's furniture is a desired objective.

Meanwhile, for those who do like to occasionally have friends over, but who don't want to invest a lot in their comfort, the Social Chair Set, which is the result of a student design project, solves the problem of how to have extra chairs when you need them and also how to store them when they're not around.

Social Chair

On the other hand, it does beat the alternative of those folding card table-style chairs!

Now, what if you don't like having people over, but still have to have them anyway? You can send a strong message about how long they should stay by providing the illusion of playful comfort while keeping them continually off balance with the "Teeter and Tot Chairs".

Teeter and Tot Chair

This furniture concept requires a quote from the student designers:

Teeter and Tot look unsuspecting as chairs. When sat on, they unexpectedly collapse. George Duan and Megan Lee designed Teeter and Tot chairs for office lounge spaces to revitalize risky play in everyday adult lives. The person who sits on Teeter or Tot will experience a sudden, unexpected drop. This falling sensation will inevitably send a strong adrenaline rush through his or her body. Sitting on Teeter and Tot is a chance for adults to experience something unexpected, exciting, and fun in their mundane office lives.

We carefully designed the structure of these chairs to ensure the safety of users, while creating risky, fun experiences for them. Through several prototype iterations, we adjusted the mechanism of the chairs to drop in height just enough to surprise users, but not fall off the chair. Teeter and Tot are upholstered from head to toe with soft materials to absorb the shock from the drop while providing another layer of safety protection for users.

Our final chair of terror is a futuristic concept that makes it possible for you to sit anywhere. Meet the Lex Bionic Chair, which isn't a chair so much as it is a fanny pack that stows two wearable legs that allow you to sit anywhere there isn't a chair, with perfect posture. Watch the video....

Believe it or not, this is the "most funded exoskeleton on Kickstarter". The future of furniture has arrived, and it's just as terrifying as we all have hoped!


30 October 2018

Since it peaked at an estimated $24.84 trillion on 20 September 2018, the S&P 500 (Index: SPX) has lost about $2.557 trillion in its total market capitalization through the close of trading on Monday, 29 October 2018, bringing its market cap down to $22.28 trillion, which is below the level where it closed 2017.

S&P 500 Total Market Capitalization, 29 December 2017 through 29 October 2018

Just five S&P 500 components account for more than $1 out of every $5 of the total net loss of market cap that the S&P 500 has seen since 20 September 2018: Amazon (Nasdaq: AMZN), Alphabet A and C (Nasdaq: GOOGL and GOOG), Facebook (Nasdaq: FB), and JP Morgan Chase (NYSE: JPM). The following chart illustrates the disproportionate bite that the market cap losses in these five components have taken out of the S&P 500's net market cap loss over the last five weeks.

Net Loss in Market Capitalization of S&P 500 Component Firms, 20 September 2018 through 29 October 2018

The really astounding thing is that much of this loss has come just during the last two days of trading!

Investors who hold a larger share of these individual stocks in their investment portfolios than their component weighting in the S&P 500 index are feeling more pain in the market's latest correction than those who invested in the index itself. Amazon's decline has been particularly noteworthy, as the company has gone from becoming the second in history to touch a trillion dollar market cap in nominal terms to losing its place as the second-most valuable company in the U.S. in a matter of just five weeks, shedding $200 billion of its market cap along the way.


29 October 2018

Before we get started about what happened in the U.S. stock market during the fourth week of October 2018, let's consider some wisdom from JC at AllStarCharts:

When the stock market is not going up, the blame game gets played. It’s a combination of shareholders losing money and media types needing something to say. It’s always someone or something’s fault and rarely described as a normal occurrence. The truth, however, is that yes, stocks falling in price is part of the regular cycles that we’ve always seen.

We've had a month with a lot of interesting days, where we define "interesting" as happening whenever the S&P 500 changes by more than 2% from its previous day's closing value. In the past month, most of the days that have been interesting have involved stock prices moving in the downward direction. And we'll tell you right now that JC is right. There was nothing abnormal about what happened during the past month.
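
For readers who want to apply that threshold to their own data, here's a minimal sketch, written in the same JavaScript the site's tools use, that flags the "interesting" days in a series of daily closing values. The closing values in the example are made-up placeholders rather than actual index data.

```javascript
// Minimal sketch of the "interesting day" rule described above:
// a day counts as interesting when the index changes by more than 2%
// from its previous day's closing value. Placeholder data, not real closes.
const closes = [2905, 2878, 2790, 2728, 2760];

const interestingDays = [];
for (let i = 1; i < closes.length; i++) {
  const pctChange = 100 * (closes[i] - closes[i - 1]) / closes[i - 1];
  if (Math.abs(pctChange) > 2) {
    interestingDays.push({ day: i, pctChange: +pctChange.toFixed(2) });
  }
}

console.log(interestingDays);
// -> [ { day: 2, pctChange: -3.06 }, { day: 3, pctChange: -2.22 } ]
```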

In fact, once the S&P 500 began its latest Lévy flight event [1], where stock prices would go when the event was complete was completely predictable. The good news is that it's now mostly over, with investors appearing to have completed shifting their forward-looking attention from the near term future of 2019-Q1 toward the more distant future quarter of 2019-Q3, where today's stock prices are now in line with the trajectory that our dividend futures-based model projected would be the case if investors were to do that.

Alternative Futures - S&P 500 - 2018Q4 - Standard Model - Snapshot on 26 Oct 2018

How long they keep their attention focused on that particular point on the future horizon is another matter. We don't know how long that might be. All we can tell you is that given the nature of how stock prices work, they're now likely within just a few percent of setting a bottom, if they haven't already, absent some fundamental deterioration in the expectations for dividends in the future or some noise event that prompts an irrational reaction among investors.

If the latter, then you'll have quite an investing opportunity and if the former, buckle up - you'll know which scenario applies from the news. Speaking of which, here are the headlines we noted for their market-moving potential during the fourth week of October 2018.

Monday, 22 October 2018
Tuesday, 23 October 2018
Wednesday, 24 October 2018
Thursday, 25 October 2018
Friday, 26 October 2018

Elsewhere, Barry Ritholtz succinctly summarized the week's markets and economics news. We'll be back next week to cover the fifth and final week of October 2018, which will also cover the first two days of November 2018!

Notes

[1] Lévy flight events are what give changes in stock prices their fat-tailed distribution characteristic, which distinguishes how they behave from what would happen if they followed a true random walk as described by a normal Gaussian distribution. We observe these events whenever investors shift their forward-looking focus from one point of time in the future to another, where the difference in the expected change in the rate of growth of dividends per share between the two periods determines how much stock prices might change when they occur.


26 October 2018

"All happy families are alike; each unhappy family is unhappy in its own way."

- Leo Tolstoy, Anna Karenina

While Tolstoy's insight into human nature applies specifically to families, it may also describe the state of many of the world's cities and the division between areas that have been formally developed, where people have easy access to obtaining life's necessities, and areas that have been informally developed, which are often defined by their lack of that same kind of access.

At least, that's one takeaway from a new study by Christa Brelsford, Taylor Martin, Joe Hand and Luís M. A. Bettencourt, who applied the mathematics of topology to the layouts of the developed neighborhoods and undeveloped slums found in many of today's cities around the world.

Topology is the branch of mathematics that studies the properties of an object that are preserved when it is morphed into very different shapes without either tearing the object or fusing portions of the object back into itself. Think of a ball of clay, which could be shaped into a cube, a bowl or even a leafy tree: a wide variety of complex geometries can be grouped into families described by one simple base shape because they share certain unique mathematical characteristics. Because they share those characteristics, it becomes possible to compare very different or complex geometries and arrive at valid conclusions that apply to all potential members of the same topological family. Here's how the authors applied the math of topology to the study of urban areas:

The physical volume of all paths, streets, and roads in a city is a connected two-dimensional surface: Any point on this surface can be reached from any other point on the same surface. This surface ends where buildings begin and at external city boundaries. Thus, the urban access network surface, U, of any city has a number of internal boundaries, b, one for each city block and another for city limits. Mathematically, such an access system is topologically equivalent to a disk with b punctures or "holes" (or a sphere with b + 1 disks removed). In this way, all urban access systems with the same number b of city blocks (i) are topologically equivalent and (ii) share an invariant number, the Euler characteristic χ(U) = 1 - b, which is independent of geometry.
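
To make the Euler characteristic in that passage a little more concrete, here's a minimal sketch that models an idealized rectangular street grid as a network of intersections (vertices) and street segments (edges). Because the two-dimensional street surface described in the quote deformation-retracts onto that network, both have the same Euler characteristic, and computing it as V - E confirms the paper's χ(U) = 1 - b relationship. The grid dimensions are arbitrary values chosen for illustration.

```javascript
// Minimal sketch: for an idealized m-by-n rectangular street grid,
// count intersections (vertices) and street segments (edges), then
// check that the Euler characteristic V - E equals 1 - b, where
// b = m * n is the number of enclosed city blocks.
function gridEulerCharacteristic(m, n) {
  const vertices = (m + 1) * (n + 1);        // street intersections
  const edges = m * (n + 1) + n * (m + 1);   // street segments between them
  const blocks = m * n;                      // enclosed city blocks
  return { chi: vertices - edges, oneMinusBlocks: 1 - blocks };
}

console.log(gridEulerCharacteristic(3, 4));   // { chi: -11, oneMinusBlocks: -11 }
console.log(gridEulerCharacteristic(10, 10)); // { chi: -99, oneMinusBlocks: -99 }
```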

The authors found they could identify such similarly developed but geometrically and geographically distinct regions as midtown Manhattan, the Las Vegas suburb of Summerlin, Nevada, and the Dharavi neighborhood of Mumbai, India, and directly link them so long as they shared the same number of blocks, as shown in the following video.

They found that all of these formally developed areas were similar in their characteristics to one another, but the undeveloped areas either within or on the periphery of these cities were quite different from these "happy" communities, where "happy" means "having adequate access to roads, water and sanitation services". Moreover, their approach to mathematically describing slum areas without such access to life's necessities may lead to their effective development, with minimal cost and disruption to the lives of the one billion people who live in these communities around the world today.

Here's how the authors foresee their new topological tools being used to address the problem of bringing necessities into today's impoverished urban areas.

Over the next couple of decades, it is estimated that infrastructure investments will need to exceed $1 trillion/year in developing nations to meet international development goals, with the majority in poor areas of developing cities. Slum upgrading is a key strategy for achieving these goals, with infrastructure costs accounting for about 50% of the total. Efficient reblocking is an essential part of these transformations because the most important determinant of the cost of building or upgrading urban infrastructure is the existence and layout of the access network. By identifying and formalizing the essence of the spatial transformations necessary for neighborhood evolution, the methods proposed here increase the benefit-cost ratio for infrastructure provision [currently ~3 for water and sanitation] and markedly accelerate — from months to minutes — most technical aspects of creating viable reblocking plans. This enables nontechnical stakeholders to focus their time and effort on the socioeconomic tradeoffs of alternative layouts [leading to savings of up to 30%] and creates precise digital maps that can formalize land uses and property records, facilitating political and civic coordination and further local development.

In other words, the problem of how to transform impoverished areas into developed ones will stop being a technical challenge and will instead become purely a political problem. As for its potential, imagine unlocking what Hernando de Soto described as the problem of "dead capital" that would come from bringing formal property rights to these undeveloped urban areas and the one billion people who live in them.

References

Brelsford, Christa; Martin, Taylor; Hand, Joe; Bettencourt, Luís M.A. Toward cities without slums: Topology and the spatial evolution of neighborhoods. Science Advances. Vol. 4, No. 8, eaar4644. DOI: 10.1126/sciadv.aar4644. 29 August 2018.


25 October 2018

2018 hasn't been very kind to U.S. homebuilders (Indices: ITB, PKB, XHB), particularly as investor expectations that the Fed will continue its series of interest rate hikes well into 2019 have taken hold during the last month.

But that's looking forward into the future. Present day data is also signaling that new home sales in the U.S. are decelerating, largely as a consequence of rising mortgage rates coinciding with the Fed's recent series of rate hikes, continuing a trend that we first identified in June 2018, shortly after new home sale prices hit their all-time peak level of unaffordability.

Since then, the median sale price of a new home in the United States has been flat to slightly falling while median household incomes have continued to rise, leading to an improvement in the affordability of new homes, which nevertheless remain elevated near their all-time highs for unaffordability.

Ratio of Trailing Twelve Month Averages for Median New Home Sales Prices and Median Household Income | Annual: 1967-2017 | Monthly: December 2000 - September 2018

That new home sale prices have stalled out since April 2018 is evident in the following chart, which presents the same data, but this time showing the relationship between the trailing year average of median new home sale prices and the trailing year average of median household income.

U.S. Median New Home Sale Prices vs Median Household Income | Annual: 1999-2017 | Monthly: December 2000 - September 2018

Let's next take a look at all the data we have available for this relationship, which extends back to 1967, to see how the recent trend compares with historic trends.

U.S. Median New Home Sale Prices vs Median Household Income | Annual: 1967-2017 | Monthly: December 2000 - September 2018

While the growth of new home sale prices has been stagnant in recent months, the sales of new homes have been growing, albeit at a slowing pace. In the following animated chart, we've captured the combined effect by calculating the effective market capitalization for new homes in the U.S. for the period from December 1975 through September 2018, where the animation will cycle between the nominal market cap and the inflation-adjusted market cap every five seconds.

Animation: Trailing Twelve Month Average New Home Sales Market Capitalization, Nominal and Inflation Adjusted, December 1975 - September 2018 (Corrected)

The data for recent months is still preliminary, and therefore subject to change over the next several months, but it suggests that the trailing twelve month average of the U.S. new home market capitalization reached a peak of $20.33 billion in nominal terms in August 2018, and has since dipped back below $20 billion in September 2018.
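
For anyone who would like to see how we combine the two series, here's a minimal sketch of the kind of calculation behind the market capitalization figures above: multiply each month's median new home sale price by the number of new homes sold that month, then average the most recent twelve months of those products. The monthly figures in the sketch are placeholder values rather than actual Census Bureau data, and dividing each month's product by that month's CPI (then rescaling to a reference month) would give the inflation-adjusted version.

```javascript
// Minimal sketch of the trailing twelve month average "market cap" for new homes:
// each month's market cap = median sale price * number of homes sold that month,
// averaged over the most recent twelve months. Placeholder values, not real data.
const months = [
  { price: 320000, sold: 52000 }, { price: 325000, sold: 55000 },
  { price: 330000, sold: 58000 }, { price: 328000, sold: 61000 },
  { price: 331000, sold: 63000 }, { price: 335000, sold: 60000 },
  { price: 326000, sold: 57000 }, { price: 324000, sold: 54000 },
  { price: 322000, sold: 50000 }, { price: 327000, sold: 53000 },
  { price: 329000, sold: 56000 }, { price: 331000, sold: 59000 },
];

const trailingAverage =
  months.reduce((sum, m) => sum + m.price * m.sold, 0) / months.length;

console.log((trailingAverage / 1e9).toFixed(2) + " billion dollars (nominal)");
```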

Taken together, all these signs are pointing to a new home market that is stalling.

Data Sources

U.S. Census Bureau. Median and Average Sales Prices of New Homes Sold in the United States. [Excel Spreadsheet]. Accessed 24 October 2018.

U.S. Census Bureau. New Residential Sales Historical Data. Houses Sold. [Excel Spreadsheet]. Accessed 24 October 2018.

U.S. Department of Labor Bureau of Labor Statistics. Consumer Price Index, All Urban Consumers - (CPI-U), U.S. City Average, All Items, 1982-84=100. [Text Document]. Accessed 11 October 2018.

Sentier Research. Household Income Trends: July 2018. [PDF Document]. Accessed 25 September 2018. [Note: We've converted all data to be in terms of current (nominal) U.S. dollars, and are using a projection for September 2018's estimate in calculating the trailing year average for the month.]


24 October 2018

Law professor Steve Vladeck recently made some noise when he proposed expanding the U.S. House of Representatives in an op-ed piece, where he argued in favor of increasing the number of elected representatives from today's 435, where each member represents about 710,000 Americans, up to 650, where each would represent around 500,000.

... unlike changing the equal representation of the Senate, changing the size of the House can be done simply by statute; the change to the Electoral College would follow automatically. As for how many seats to add without creating an unmanageable legislative body, a good starting point might be to reduce the average size of House districts by one-third — to 500,000 residents per representative.

Given estimates that the U.S. population today is roughly 325 million, that would mean expanding the House to roughly 650 seats — coincidentally, the exact size of Britain’s House of Commons. (Another proposal is the so-called “Wyoming rule,” which would peg the average size of House districts to the total population of the least-populous state, i.e., 563,626, under the 2010 Census.)

The United Kingdom has a population of about 67 million people today, about one-fifth the population figure he cites for the United States, so that's not exactly a compelling argument for improving representative democracy so much as it is an exercise in mystic numerology.

But since Vladeck claims that having a lower people-per-representative ratio would lead to an "unmanageable legislative body", let's start with the following argument he made in support of his position to find out how well founded it is:

There are three major problems with Congress’s refusal to increase the size of the House. First, and most obviously, it is difficult — if not impossible — for any one person adequately to represent the interests of three-quarters of a million people. A member of Congress who somehow finds a way to meet with 1,000 different constituents every single day would still not see everyone they represent over the course of a two-year term.

We wondered how big a problem that limited capacity for personal interaction with an elected representative really might be, so we extracted historical population data from each U.S. Census from 1790 through 2010, the most recent year in which the decennial Census has been conducted, along with the number of representatives set as a result of each, so that we could calculate the changing population-to-representative ratio over time. We also recognized that if a representative were unable to engage with their constituents to the degree established at the nation's founding, they would be likely to hire personal staff to assist them for that purpose, so we collected data on the size of the personal staff for each representative's office over these years.

The first chart below shows what we found for the total numbers of elected representatives and their paid personal staff members who would be available to engage with members of their congressional districts for the census years of 1790 through 2010, which would cover the total through this point of time in 2018.

Number of Elected Members of House of Representatives and Their Personal Staffs, 1790-2018

This chart presents an interesting point. Before 1893, elected members of the U.S. House of Representatives were not permitted to hire people to serve on their personal staffs using taxpayer dollars. After 1893, they were allowed to hire one person to fill that role, which lasted until 1940, when they could expand their staffs to hire three people. After World War 2, that doubled to six, which lasted until 1966, when the number of personal staff allowed per representative doubled again to 12. In 1975, that figure rose to 18 full time employees and in 1979, that number was again augmented by allowing the hiring of an additional four part time workers, which we've treated as the equivalent of two full time workers in our chart.

How does that affect the ratio of constituent contact with their elected representative's office? That answer is shown in our next chart, where we've indicated the number of a district's residents served per member of their elected representative's office, counting both the elected representative and their paid personal staff, as well as the ratio for the representative alone, if they were completely unassisted.

Number of Americans Served per U.S. House of Representatives Office Employee (Elected Member and Their Personal Staffs), 1790-2018

In 1790, we calculate that a single elected member of the U.S. House of Representatives would have served approximately 37,421 Americans from their home congressional district, without the aid of any paid personal staff. Through the 2010 census year, we find that a single employee of an elected U.S. Representative's office would serve about 33,798 Americans, nearly 10% fewer than what an unassisted Representative could achieve on their own in 1790. It is also far less than the population-to-representative ratio of 177,000 to 1 that was reached in 1890, prior to the decision of the U.S. House of Representatives to allow each member to hire personal staff to assist them in serving their congressional district's needs.

If we ignore the contribution of the members' personal staffs in assisting them, then the ratio has grown from 37,421 per representative in 1790 to 709,760 per representative through the 2010 census year. Focusing on that number however is really a bit of misdirection on Vladeck's part, because of the role that representatives' personal staffs have in facilitating communication between the elected representative and their constituents.
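
Here's a minimal sketch showing where ratios like those come from, using the official census counts and the staff allowances described above (with the four part time workers permitted since 1979 counted as two full time equivalents, as in our chart).

```javascript
// Minimal sketch reproducing the constituents-per-office-employee ratios.
// Each office = 1 elected representative + their allowed personal staff.
function constituentsPerOfficeEmployee(population, representatives, staffPerRep) {
  const officeEmployees = representatives * (1 + staffPerRep);
  return {
    perRepresentativeAlone: Math.round(population / representatives),
    perOfficeEmployee: Math.round(population / officeEmployees),
  };
}

// 1790 census: 3,929,214 people, 105 representatives apportioned, no paid personal staff
console.log(constituentsPerOfficeEmployee(3929214, 105, 0));
// -> { perRepresentativeAlone: 37421, perOfficeEmployee: 37421 }

// 2010 census: 308,745,538 people, 435 representatives,
// 18 full time + 4 part time staff (counted as 20 full time equivalents) each
console.log(constituentsPerOfficeEmployee(308745538, 435, 20));
// -> { perRepresentativeAlone: 709760, perOfficeEmployee: 33798 }
```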

Closing out Vladeck's other two perceived "major" problems, the second one has to do with the unequal distribution of the U.S. population among individual states, which makes it difficult to slice and dice congressional districts so that each covers an identically sized population, although districts are currently approximately equal within each state regardless of its relative population to other states. Vladeck's third problem is more about his being upset that states with small populations end up with a disproportionate say in the Electoral College under the geographic affirmative action program established by the authors of the U.S. Constitution to reduce discrimination against dispersed rural populations. Here, he argues in favor of expanding the political power of more urbanized, higher-population regions to influence the selection of the U.S. President by expanding the House of Representatives to their benefit.

We're ambivalent on whether increasing the number of members of the House of Representatives is a worthwhile consideration, but if the point of the move is to improve the ability of congressional district residents to interact with their representative's office, there's not much to be gained from the proposal that Vladeck has put forward. We might be more positively inclined to support such a change if the pay of both elected representatives and their personal staffs were slashed proportionately according to the reduction in the number of residents served, along with the number of paid personal staff members, since there would clearly be less work for them to do. Plus, we know exactly where they could meet in Washington D.C. if the U.S. Congress ever got really serious about expanding the number of representatives so that the original intent of having no fewer than 30,000 Americans for each U.S. Representative to represent were truly realized....

Data Sources

U.S. Census Bureau. Decennial Census of Population and Housing. [Online Database]. Accessed 20 October 2018.

Wikipedia. United States Congressional Apportionment. [Online Article]. Accessed 20 October 2018.

Sourcewatch. Congressional offices and staff. [Online Article]. Accessed 30 May 2015.



23 October 2018
Mega Millions 23 October 2018 Jackpot

Are you thinking about buying a 1 in 302,575,350 chance to win a $1.6+ billion jackpot in the Mega Millions lottery today?

Mega Millions jackpot enters ‘uncharted territory’ at record $1.6 billion

Yes, that headline is correct: The Mega Millions jackpot is at $1.6 billion. With a B.

All eyes were on the latest drawing Friday night, when the jackpot was at an already mind-boggling $1 billion. However, with no ticket matching all six numbers drawn — 15, 23, 53, 65, 70 and Mega Ball 7 — the grand prize now swells to $1.6 billion....

The estimated cash option for a $1.6 billion jackpot — should a winner choose to take a one-time lump sum payment instead of annual payouts over 30 years — is about $905 million, according to Mega Millions officials.

That's quite a lot of money, which will very likely entice a lot of Americans who normally don't consider playing the lottery into the mix for the grand prize in tonight's Mega Millions lottery drawing. To give you an idea of how stiff the competition will be in the Tuesday evening drawing, about 57% (or 172,467,950) of the 302,575,350 possible combinations of numbers that could be played in the Friday, 19 October 2018 drawing were played, when only a $1 billion jackpot was up for grabs. And now with an even bigger prize at stake, it would be a safe bet that an even larger percentage of all the possible number combinations will be played in the next drawing.

But is the jackpot large enough to be worth your $2.00 to buy one chance at picking the winning numbers? Or to be worth it if you have to share the jackpot with one or more other players who have also picked the winning numbers?

Two big factors go into the answer to those questions: the odds of winning and the amount of taxes that you would have to pay on the prize you might win. In the following tool, we've plugged in the numbers that are projected to apply for Mega Millions' record high jackpot drawing on Tuesday, 23 October 2018, but you can change them to apply to any lottery drawing. If you're considering playing the Powerball lottery, whose own grand prize in its next drawing on Wednesday, 24 October 2018 will have at least $620 million at stake for whoever beats its 1 in 292,201,338 odds of winning, we have another version of this tool set up with its default numbers to answer the same questions. [If you're reading this article on a site that republishes our RSS news feed, please click here to access a working version of the tool.]

Basic Lottery Game Data
Input Data Values
How many possible numbers are there to choose from in the main set?
How many numbers from this set does the player get to pick?
"Powerball" or "MegaBall" Data
How many possible numbers are there to choose from for the Power or Mega Ball?
How many numbers from this set does the player get to pick?
Ticket Cost Data
Cost of a Lottery Ticket
Income Tax Rates
Federal Income Tax Rate (%)
State Income Tax Rate (%)
Local Income Tax Rate (%)

The Odds of Winning the Lottery and the Magic Jackpot
Calculated Results Values
The Odds of Winning the Jackpot (1 in ...)
The Magic Jackpot (The Number That Makes Playing the Lottery Worth the Cost of a Ticket)

In the tool above, we've set the federal income tax rate to 37%, which is the maximum income tax rate that applies for 2018. We've set the state and local income tax rates to 0%, which you'll need to change if you live in a state and/or city and/or county that taxes lottery winnings.

Running the default numbers we've entered, we find that the projected jackpot of $1.6+ billion is greater than the "magic" jackpot of $961 million that would make the Mega Millions lottery worth paying $2 for a ticket to play today. At the same time, because the jackpot is currently less than $1.92 billion (double the tool's magic jackpot figure), it's not worth buying a second ticket to play, nor is it at a level that would make it worth paying $2 to play only to have to share the grand prize with one or more additional winners who might pick the same winning numbers.
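
For those who want to check the tool's results by hand, here's a minimal sketch of the arithmetic it appears to implement, as best we can reconstruct it from the figures quoted above: the jackpot odds come from choosing 5 of 70 main numbers and 1 of 25 Mega Balls, and the "magic" jackpot is the pre-tax prize whose after-tax value, multiplied by the probability of winning, just covers the price of a ticket.

```javascript
// Minimal sketch of the arithmetic behind the tool's default results.
// Jackpot odds: number of ways to choose 5 of 70 main numbers times
// the 25 possible Mega Ball picks.
function combinations(n, k) {
  let result = 1;
  for (let i = 1; i <= k; i++) {
    result = result * (n - k + i) / i;
  }
  return result;
}

const odds = combinations(70, 5) * combinations(25, 1);

// "Magic" jackpot: the pre-tax prize whose after-tax expected value equals
// the cost of a ticket, i.e. jackpot * (1 - taxRate) / odds = ticketCost.
const ticketCost = 2.00;
const taxRate = 0.37; // top 2018 federal rate, with no state or local tax
const magicJackpot = ticketCost * odds / (1 - taxRate);

console.log(odds);                           // 302575350
console.log(Math.round(magicJackpot / 1e6)); // about 961 (million dollars)
```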

On the 1 in 302,575,350 chance that you do win, you'll have some decisions to make about how you want to collect your winnings. Be sure to check out this article to consider the relative merits of either taking the lump sum or of taking the annuity payout over 30 years.



22 October 2018

The S&P 500 (Index: SPX) ended the third week of October 2018 much the same way as it ended the previous week, with investors appearing to split their forward-looking focus between the future quarters of 2019-Q1 and 2019-Q3.

Alternative Futures - S&P 500 - 2018Q4 - Standard Model - Snapshot on 19 Oct 2018

In between, the U.S. stock market continued to show signs of heightened volatility, with fourth quarter earnings pulling investors' attention in one direction (upward), and concerns over the frequency and timing of future interest rate hikes by the Fed pulling it in the other (downward).

By the end of the week, investors appeared to be putting a stronger weight on what U.S. interest rates will be a year from now (60%) than on the earnings to come in the nearer term (40%), as visualized by the relative spacing between the actual trajectory of the S&P 500 and the potential trajectories associated with either 2019-Q1 or 2019-Q3 indicated in our spaghetti forecast chart.

At least, that's what we were able to glean from the market moving news headlines from the week, which provide the context of what influenced investors to go along with what our dividend futures-based model projected during Week 3 of October 2018.

Monday, 15 October 2018
Tuesday, 16 October 2018
Wednesday, 17 October 2018
Thursday, 18 October 2018
Friday, 19 October 2018

Meanwhile, Barry Ritholtz outlined the week's markets and economics news' positives and negatives. There are seven of each, with an disconcerting new entry at the top of the negative column....


19 October 2018

On 24 September 2018, mathematician Michael Atiyah announced at the Heidelberg Laureate Forum that he believed he had cracked the Riemann Hypothesis, which, if his proof holds, would represent a very big deal. Not just because the proof would come with a one million dollar prize for solving one of the Clay Mathematics Institute's Millennium Prize Problems, or because it describes the distribution of prime numbers, but also because it would automatically prove a lot of other mathematical hypotheses that rely on the Riemann Hypothesis being valid for their contentions to hold.

For more information about what the Riemann Hypothesis is, we recommend viewing either Numberphile's 17-minute video or 3Blue1Brown's 22-minute long video on the topic, both of which are well done but offer different strengths in presentation. What we found interesting in Atiyah's announcement is that he claims the proof came about because of work he was doing (leaked here?) to analytically derive the fine structure constant from physics, which is fascinating in and of itself.

In the following video from the University of Nottingham's Sixty Symbols project, Laurence Eaves provides a blissfully short 5-minute long explanation of the significance of the fine structure constant, which is the measure of the strength of the electromagnetic force governing how electrically-charged elementary particles interact.

The three fundamental constants that are combined in the fine structure constant are:

  • the elementary charge carried by the electron,
  • Planck's constant, and
  • the speed of light in a vacuum.

If we include pi, there are technically four universal constants in the formulation. The fine structure constant is estimated to be very nearly equal to 1 divided by 137, which, if it were exactly equal to that ratio, would really be extraordinary, because the value 137 happens to also be a Pythagorean prime number. As it happens, the denominator in the fine structure constant is currently estimated to be 137.035999138..., so it's close, but not quite.
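
As a quick sanity check on that figure, here's a minimal sketch that computes the inverse of the fine structure constant from its SI formulation, α = e²/(4πε₀ħc), using published CODATA values for the physical constants involved.

```javascript
// Minimal sketch: compute the inverse fine structure constant, 1/alpha,
// from CODATA values of the constants in its SI formulation:
//   alpha = e^2 / (4 * pi * epsilon_0 * hbar * c)
const e = 1.602176634e-19;         // elementary charge, coulombs
const epsilon0 = 8.8541878128e-12; // vacuum permittivity, farads per meter
const hbar = 1.054571817e-34;      // reduced Planck constant, joule-seconds
const c = 299792458;               // speed of light, meters per second

const alpha = (e * e) / (4 * Math.PI * epsilon0 * hbar * c);
console.log(1 / alpha); // approximately 137.036
```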


18 October 2018

Not long ago, thanks to a botched drink order, we had our first experience with the new style plastic "adult sippy cup" lids that Starbucks (NYSE: SBUX) announced would replace plastic straws for all its iced beverages back on 9 July 2018.

Starbucks promoted the new lids as a way to reduce the amount of plastic waste that ends up in the world's oceans, where the company's press release featured the comments of environmental activist Nicholas Mallos:

“Starbucks’ decision to phase out single-use plastic straws is a shining example of the important role that companies can play in stemming the tide of ocean plastic,” said Nicholas Mallos, director of Ocean Conservancy’s Trash Free Seas program. “With eight million metric tons of plastic entering the ocean every year, we cannot afford to let industry sit on the sidelines.”

Starbucks' adult sippy cup lids are being first rolled out in the North American cities of Seattle, Washington and Vancouver, British Columbia, even though neither city, nor any city in North America, nor the entire continent of North America, is a major source of plastic waste that enters the world's oceans each year. At present, neither Starbucks nor the Ocean Conservancy has indicated how they will reduce the total volume of plastic trash entering the world's oceans from the places that are primarily responsible for the practice.

It's already been remarked that the new plastic lids use more plastic than the combination of plastic straws and the "flimsier" plastic lids that they are replacing, as can be seen in the following investigative report:

What hasn't yet been much remarked upon is that the thicker plastic used in Starbucks' adult sippy cup lids makes them less effective at their primary purpose: containing the iced beverage within its cup without spilling.

The thicker plastic makes the lid design more rigid than its predecessor, which makes it much easier to pop off the top of the cup with a very light amount of pressure, applied either to the lid or to the upper portion of the body of the plastic cup. The previous flat lid design didn't have that problem, because it was much more flexible, allowing it to remain attached to the cup when handled similarly.

If you get one, try the following experiments for yourself, each of which represents very common ways that these kinds of beverages are handled by consumers, particularly if they are drinking them on the go, such as in a car:

  • Apply a light amount of pressure at a single point on the underside of the lid where it is attached to the cup.
  • Pick the cup up by the lid at several points around the lid and squeeze gently.
  • Hold the plastic cup near the top, but not touching the lid, and squeeze gently.

In each case, you should discover that the lid pops off much more easily than the old-style flat lid design does, greatly increasing the risk of spills. Although our experiments currently involve a sample size of just one cup, where perhaps Starbucks' inattentive baristas [1] randomly provided us with an extraordinarily poor lid/cup combination, our first experience suggests that Starbucks' new adult sippy cup lids are not ready for prime time as they would appear to be less effective as lids than the design they will replace.

The funny thing is that if Starbucks would rather that customers pop the tops off their iced beverages when they consume them, the practice of drinking them on the go be damned, they could have simply eliminated the plastic straws and kept their existing flat plastic lids, which are actually designed to be used as coasters when consumers choose to drink their iced beverage from a lid-free cup. That's something else at which Starbucks' new lid design also fails.


[1] Remember, we only got the cup because they botched our drink order, where they chose to provide us with an altogether different beverage than the one we ordered. We chose to go along with it because it came with the new cup and lid design, which we wouldn't have had the chance to play with if they had gotten our order right. For us, it's additional evidence that the company is increasingly troubled, where it is perhaps skimping on employee training as well as product testing as part of a series of poorly considered cost reduction strategies in trying to boost its bottom line.


17 October 2018

The U.S. Department of Education has updated its statistics on the number of student loan defaults occurring within three years of a borrower beginning repayment. The latest data covers borrowers who began making payments on their student loans during the U.S. government's 2015 fiscal year, which ran from 1 October 2014 through 30 September 2015, who have subsequently stopped making scheduled payments for a prolonged period of time without making arrangements to either defer them or to seek relief from them during the following three years.

The following chart shows the number of student loan borrowers who have defaulted within three years after starting to make payments in each fiscal year from 2009 through 2015 according to the type of four year degree-granting institution they attended, covering proprietary (or for-profit) universities, public universities, and private (or non-profit) universities.

Number of Student Loan Borrowers Who Have Defaulted Within 3 Years of Beginning Repayment, by Institution Type, 2009-2015

For student loan borrowers who attended four-year degree granting institutions and began making payments during the U.S. government's 2015 fiscal year, 531,653 went on to default on their student loan debt over the next three years. Of these, 51% attended public universities, 34% attended proprietary (or for-profit) institutions, and 15% attended private universities.

The next chart shows the percentage of student loan borrowers for each type of institution that began repaying their student loans, but went on to default on them within the following three years.

Percent of Student Loan Borrowers Who Have Defaulted Within 3 Years of Beginning Repayment, by Institution Type, 2009-2015

We find that although public universities account for the largest number of defaulting student loan borrowers, a larger percentage share of students who attend proprietary (or for-profit) institutions have consistently defaulted on their student loans.

The final chart reveals the average annual cost of college tuition and fees for each type of institution, based on data from the 2016-17 academic year, which is the closest we could get to 2015 for all three types of four year degree-granting institutions. The $9,670 value shown for public institutions applies for state residents eligible for in-state tuition rates, where the average public university tuition and fees for out-of-state residents is $24,820.

Average Published Undergraduate Charges (Tuition and Fees) by Institution Type, 2016-17

Since March 2010, when President Obama signed legislation putting the U.S. government into the student loan business, most of these defaults represent money borrowed from the U.S. government.

Today, about 90 percent of loans made to students and their families are from the federal government, which does not follow the same lending standards required of financial institutions. The federal government holds around $1.4 trillion in student loans on its books and is currently seeing a double digit delinquency and default rate.

10.3% of student loan borrowers who started making payments on their debt to the federal government in 2015 have defaulted. That figure has come down and stabilized over the last several years, but is still a double-digit problem for the demographic group with the lowest rate of unemployment in the U.S. economy. Meanwhile, many who are making payments on their student loans aren't getting very far in paying them down.

Just as alarming as the high number of defaults on federal student loans is that less than half of new borrowers are able to put a dent in their principal balance within three years of entering repayment.

Currently, the U.S. government makes it very difficult, if not nearly impossible for borrowers to have their student loan debts discharged through bankruptcy proceedings, which is pretty much what you would expect for any creditor who has the power to write the rules to benefit itself by forcing the repayment of any loan that it has either guaranteed or issued.

If it cannot be discharged in bankruptcy and the money borrowed is owed to the government, it isn't debt. It's taxes. The sooner that's fixed by transforming student loans back into debt, among other reforms, the better it will be for everyone involved.

References

U.S. Department of Education. Official Cohort Default Rates for Schools, by Institution Type. [PDF Documents: 2009-2011, 2012-2014, 2013-2015]. 26 September 2018. Accessed 13 October 2018.

College Board. Trends in Higher Education. Average Published Undergraduate Charges by Sector and by Carnegie Classification, 2017-18. [Online Document]. 17 November 2017.


16 October 2018

It's been several months since we last considered the deteriorating situation for the future of dividends at General Electric (NYSE: GE), where we wrote that either GE or its dividend, and quite possibly both, were set to shrink.

Four months and one CEO later, and that statement holds even more true today, where we believe that it is no longer a question of "if", but of "when" and "by how much".

On 12 October 2018, GE announced it would delay the release of its fourth quarter financial statements until the end of the month to allow the company's new CEO, Larry Culp, to complete his "initial business reviews and site visits".

At the same time, we're coming up on the one year anniversary of when GE's then new CEO, John Flannery, slashed its quarterly dividend in half, from $0.24 per share to $0.12 per share, so it's a good time to look at what's changed for investors, which we can do by looking at just one number: the company's market capitalization.

When GE declared its last quarterly dividend of $0.24 per share on 7 September 2017, GE's market cap was nearly $208 billion. Exactly one year later, when declaring its fourth quarter dividend payment for 2018, GE's market cap was $108 billion, which is pretty close to where it stands today, just over a month later. That missing $100 billion goes a long way toward explaining why GE now has a new CEO.

That decline also gives us an indication of how much GE's new manager may be looking to cut the company's dividend. The following chart shows the relationship between GE's market cap and its aggregate dividend payouts for each quarter since 12 June 2009.

General Electric Market Capitalization versus Forward Year Aggregate Dividends at Dividend Declaration Dates from 12 June 2009 through 7 September 2018

Given the historical relationship captured in the chart, at GE's current day market cap of $108 billion, we would anticipate a 35-40% reduction in the size of the company's dividends, from $0.12 per share to about $0.07 per share, which, for all practical purposes, is already baked into the company's average share price over the last five weeks. Culp could announce this change today and there would be minimal impact to the company's stock price.

But, that 35-40% lower dividend payment is for a General Electric that still has its health care division, which it has been planning to spin off. Without it, GE will need to cut its dividend by more than that percentage, because it won't have the revenues, earnings, and cash flow that the division provides to sustain a dividend reduced by only 35-40% from today's level.

We think then that it's very likely that once the new CEO's review of the company's operating and financial situation is complete, GE will suspend its dividend altogether. The change would help preserve what has been the company's increasingly distressed cash flow, which some analysts have indicated is not sufficient to cover both GE's current quarterly dividend of $0.12 per share and its operating requirements.

Other analysts believe that GE's dividend is safe. Based on the information we have today, we are not in that camp.

References

Dividend.com. General Electric Dividend Payout History. [Online Database]. Accessed 14 October 2018.

Ycharts. General Electric Market Cap. [Online Database]. Accessed 14 October 2018.

Yahoo! Finance. General Electric Company Historical Prices. [Online Database]. Accessed 14 October 2018.


15 October 2018

The second week of October 2018 was one of the most interesting for the S&P 500 (Index: SPX) in quite some time.

For our purposes, to qualify as interesting, the S&P 500 needs to change in value by 2% or more on any given day, and Week 2 of October 2018 had two days that met that simple threshold: Wednesday, 10 October 2018, when it fell by more than 3%, and Thursday, 11 October 2018, when it continued to fall by another 2%.

Alternative Futures - S&P 500 - 2018Q4 - Standard Model - Snapshot on 12 Oct 2018

In doing so, investors moved the market by shifting their forward-looking focus from 2019-Q1 toward the more distant future of 2019-Q3. Through the end of the week on Friday, 12 October 2018, the level of the S&P 500 in our dividend futures-based model indicates that investors are splitting their attention between these two quarters, putting a slightly heavier weighting on the nearer term future described by the expectations associated with 2019-Q1 than on those for 2019-Q3.

Why focus on 2019-Q3 at all? Starting with basic fundamentals, the dividend futures for the S&P 500 associated with this upcoming quarter have been flat at $14.00 per share since 12 July 2018, where they represent a deceleration in the year-over-year rate of growth of trailing year dividends per share for the index. As such, the expectation that the rate of growth of S&P 500 dividends, and by extension, U.S. economic growth, will slow during this future quarter has been baked into investor considerations for quite some time.

Since there has been no change in these fundamental expectations, we can rule out any growing fear of a stalling stock market and economy at this time as a causal factor behind the S&P 500's latest Lévy flight event. Ditto for any news items related to the ongoing low-level trade war between the U.S. and China, which for all the noise in the news it has generated since it began earlier this year, has not produced a noticeable impact on dividend futures to date.

What has changed very recently however is investor expectations of a Fed rate hike in 2019-Q3, which would explain why investors would focus on this particular point of time in the future at this point of time in the present. With the U.S. economy continuing to grow strongly, and with the lowest official unemployment rate since the late 1960s, the probability that the Fed will continue its series of interest rate hikes in this quarter has very recently surged above 50% according to interest rate futures. This change in investor expectations has also coincided with a period of heightened volatility in intraday stock price values, which peaked with the U.S. stock market's Lévy flight event on 10 October 2018. The following chart shows the probabilities indicated by the CME Group's FedWatch tool for upcoming rate hikes as of the close of trading on 12 October 2018.

CME Group Fedwatch Tool - Probabilities of Change in Federal Funds Rate - Snapshot on 12 Oct 2018

We see that investors are currently giving better than even odds of a quarter percent or greater rate hike occurring when the Fed meets on 19 December 2018 (2018-Q4), then again on 20 March 2019 (2019-Q1), and again on 18 September 2019 (2019-Q3).

Since the week's Lévy flight event prompted us to feature a special mid-week edition of our weekly S&P 500 Chaos series, we only have two additional days of news headlines to catch up on. Here's the rest of the week's market-moving news....

Thursday, 11 October 2018
Friday, 12 October 2018

Looking for a bigger picture of the week's major economics and markets news? Barry Ritholtz has outlined the week's positives and negatives. As a special bonus, he also wrote one of the better analytical pieces of the week: The Stock-Market Meltdown That Everyone Saw Coming! Absolutely essential reading, and the only other piece we know of that properly mentioned random walks in describing what happened with stock prices in the aftermath of the week's Lévy flight event.


About Political Calculations

Welcome to the blogosphere's toolchest! Here, unlike other blogs dedicated to analyzing current events, we create easy-to-use, simple tools to do the math related to them so you can get in on the action too! If you would like to learn more about these tools, or if you would like to contribute ideas to develop for this blog, please e-mail us at:

ironman at politicalcalculations

Thanks in advance!
