Political Calculations
Unexpectedly Intriguing!
31 July 2018

On Thursday, 26 July 2018, the shareholders of Facebook (NASDAQ: FB) experienced the worst ever rout of an individual stock in stock market history, as they collectively lost over $119 billion in just one day.

Facebook on Thursday posted the largest one-day loss in market value by any company in U.S. stock market history after releasing a disastrous quarterly report.

The social media giant's market capitalization plummeted by $119 billion to $510 billion as its stock price plummeted by 19 percent. At Wednesday's close, Facebook's market cap had totaled nearly $630 billion, according to FactSet.

No company in the history of the U.S. stock market has ever lost $100 billion in market value in just one day, but two came close.

The two that came close were Intel (NASDAQ: INTC) and Microsoft (NASDAQ: MSFT), both of which got monkey-hammered in separate one-day crash events back in 2000. CNBC put together the chart below showing the Top 12 market cap crashes for individual stocks. It's a well-designed, highly informative chart, but if you look closely, you'll see that all of the recorded Top 12 market cap crash events have taken place during a period spanning just the last 18 years.

CNBC: Market cap losses in big cap stocks

You don't have to be either a stockholder or a consumer of Campbell's Soup (NYSE: CPB) to know that the U.S. stock market has been around a lot longer than that, and also that there's been a bit of inflation over that period of time. What would happen if we adjusted the largest one-day market cap losses for these individual stocks to account for that factor?

And then we realized that would be a pretty worthless exercise. If we were to do so, we would have to choose which inflation measure to use for the adjustment (Consumer Price Index? Personal Consumption Expenditures? GDP Deflator? Tomato Soup Standard?) and also the period of time for which to set the value of a dollar (should we use constant July 2018 U.S. dollars? January 2000? How about an average over the year 2000? Why not split the difference and choose some other point in time somewhere in the middle?).

And what happens when some other company's stock experiences another record-setting one-day crash event several years from now? We would have to make all those choices and do all that math all over again.

The nice thing about the kind of nominal data shown in CNBC's chart is that it doesn't pose these kinds of problems. Those are the values that have direct meaning at the time at which the events occurred. What we really want is a better way to describe the magnitude of these high dollar one-day market cap crash events that directly captures how much of a bite they took out of the stock market on the days they occurred.

It occurred to us that the best way to capture that kind of information with nominal data is to directly calculate the impact of the individual stock's market cap crash on the whole stock market. For our purposes, that means dividing the individual stock's market cap loss by the market capitalization of a major stock market index that includes the stock as a component, which gives us its relative magnitude as a simple percentage of the total valuation of the index.

Since CNBC's Top 12 list is made up of large cap stocks, it makes sense to use the market cap of the S&P 500: each listed company is a component of this market-cap weighted index, and the index represents about 80% of the capitalization of the entire U.S. stock market. If you understand the 80-20 rule, you know why that matters....

What this does is give us an objective measure that doesn't require needlessly repetitive and not terribly worthwhile calculations to tell how big of a market rout such a one-day crash event may have been. In fact, we can immediately compare the relative magnitude of a given event with others to tell which would have had a larger impact in terms of the stock market of its day.

We've done that math for CNBC's Top 12 market cap crash events for individual stocks in the following dynamic table, which you can sort from either high-to-low or low-to-high by clicking the various column headings. What's more, if you insist on putting a particular day's dollar value on the magnitude of a market cap crash event, you can do so by simply multiplying the percentage we've calculated by the applicable market cap for the S&P 500 index (we've estimated those values with respect to the S&P 500's recorded market cap at the beginning of July 2018).
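If you'd like to run that math yourself, here's a minimal sketch of the calculation, where the S&P 500 market cap values are rough placeholders rather than the index's official figures:

```python
# A minimal sketch of the relative-magnitude math described above. The S&P 500
# market cap figures below are rough placeholders, not official index data.
stock_market_cap_loss = 119_419_310_000   # Facebook's one-day loss, 26 Jul 2018
sp500_market_cap_on_loss_day = 24.9e12    # placeholder: S&P 500 total market cap that day
sp500_market_cap_jul_2018 = 24.9e12       # placeholder: market cap used for "adjusted" dollars

# Relative magnitude: the share of the whole index's valuation that the loss represented.
relative_loss = stock_market_cap_loss / sp500_market_cap_on_loss_day
print(f"{relative_loss:.2%}")             # ~0.48% of the S&P 500

# To put a July 2018 dollar figure on any event, multiply its percentage by the
# July 2018 market cap of the index.
adjusted_loss = relative_loss * sp500_market_cap_jul_2018
print(f"${adjusted_loss / 1e9:.1f} billion")
```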

If you're accessing this article on a site that republishes our RSS news feed, please click through to access a working version of the dynamic table. If not, we've added an extra column with the "Adjusted Rank" to make it easy to tell how each company stock's market-cap crash ranked in terms of how big a bite it took out of the S&P 500 total market cap.

Largest One-Day Market Cap Losses for Individual Stocks Since January 2000
Nominal Rank | Company (Ticker) | Date of Loss | Largest One-Day Market Cap Loss | As Percent of S&P 500 Market Cap | Adjusted Loss (for July 2018 S&P 500 Market Cap) | Adjusted Rank
1 | Facebook (NASDAQ: FB) | 26 Jul 2018 | $119,419,310,000 | 0.48% | $119,419,310,000 | 4
2 | Intel (NASDAQ: INTC) | 22 Sep 2000 | $90,736,960,000 | 0.76% | $189,950,184,825 | 1
3 | Microsoft (NASDAQ: MSFT) | 3 Apr 2000 | $80,024,600,000 | 0.64% | $159,693,090,470 | 2
4 | Apple (NASDAQ: AAPL) | 24 Jan 2013 | $59,633,680,000 | 0.47% | $117,297,800,738 | 5
5 | Exxon Mobil (NYSE: XOM) | 15 Oct 2008 | $52,511,380,000 | 0.58% | $146,531,232,794 | 3
6 | General Electric (NYSE: GE) | 11 Apr 2008 | $46,914,750,000 | 0.38% | $96,029,919,403 | 7
7 | Alphabet (Google) (NASDAQ: GOOGL) | 2 Feb 2018 | $41,074,840,000 | 0.17% | $42,719,042,603 | 10
8 | Bank of America (NYSE: BAC) | 7 Oct 2008 | $38,486,790,000 | 0.40% | $101,412,985,020 | 6
9 | Amazon (NASDAQ: AMZN) | 2 Apr 2018 | $36,477,450,000 | 0.16% | $40,641,561,536 | 11
10 | Wells Fargo (NYSE: WFC) | 5 Feb 2018 | $28,905,810,000 | 0.12% | $30,062,893,218 | 12
11 | Citigroup (NYSE: C) | 23 Jul 2002 | $25,941,230,000 | 0.29% | $71,520,350,626 | 8
12 | JPMorgan Chase (NYSE: JPM) | 29 Sep 2008 | $24,884,180,000 | 0.23% | $57,118,598,093 | 9

After adjusting for the relative impact on the S&P 500 in terms of the index' total market capitalization, we find that Facebook's nominal single-day record decline actually ranks fourth, at 0.48% of the U.S. stock market as represented by the S&P 500 index. Intel's 22 September 2000 crash claims the top spot with an adjusted market loss of 0.76% (or nearly $190 billion in terms of July 2018's S&P 500 market cap), followed by Microsoft's 3 April 2000 decline of 0.64% (nearly $160 billion) in second place and Exxon Mobil's 15 October 2008 drop of 0.58% (over $146 billion) in third. Each of these took a bigger relative bite out of the stock market of its day than Facebook's nominally record-setting $119 billion loss of market valuation did on 26 July 2018.

And that's just covering the biggest one-day market cap crashes for individual stocks since January 2000! If we went further back in time, who knows what the real all-time one-day market cap crash champion would turn out to be! If you have the individual stock's market cap loss and the total market cap for the S&P 500 index on the day(s) before the individual stock's crash event, the math is easy, the results are clear, and any time a new event occurs, you can simply insert the results of the math into the appropriate row of the ranking table.

References

Imbert, Fred and Francolla, Gina. Facebook's $100 billion-plus rout is the biggest loss in stock market history. CNBC. [Online Article]. 26 July 2018.

Standard & Poor's. S&P Market Attributes Web File. [Excel Spreadsheet]. 29 June 2018. Accessed 29 July 2018.

McMillan, Alex Frew. Year of the big stock drop. CNNMoney. [Online Article]. 12 October 2000.


30 July 2018

It's not often that we can summarize what happened to the S&P 500 during the course of a week in terms of a single stock, but the record-setting one-day decline in the market's valuation of Facebook (NASDAQ: FB) made the story of that single company's fall the major market-moving news for the entire S&P 500 for the fourth week of July 2018.

To understand why a bad day for a big tech company like Facebook had such an impact on the S&P 500 index, let's revisit a chart we presented earlier this year, where we showed the relative share of a number of big tech firms within the S&P 500 index.

Big Tech in the S&P 500, 28 March 2018

That chart was based on the market data that was available back on 28 March 2018. Flashing forward to Wednesday, 25 July 2018, the day before the company would set the nominal all-time record for a one-day crash in the market cap of its stock, SlickCharts reported that Facebook's share of the total market capitalization of the entire S&P 500 had risen from the 1.64% shown in the chart above to 2.16% over the previous four months. Which is to say that a little over $1 out of every $50 invested in the companies that make up the S&P 500 was invested in shares of Facebook.

By the end of trading on Friday, 27 July 2018, investors had reduced the value of Facebook shares back down to 1.76% of the entire market cap valuation of the S&P 500, after the company had endured two days of losses.
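As a quick check on the arithmetic behind that "$1 out of every $50" figure, here's a minimal sketch using the index weights quoted above:

```python
# Converting Facebook's index weight into dollars per $50 invested in the S&P 500.
facebook_weight_25_jul = 0.0216            # share of the index's total market cap
facebook_weight_27_jul = 0.0176            # share two trading days later

print(round(facebook_weight_25_jul * 50, 2))   # 1.08 -> a little over $1 of every $50
print(round(facebook_weight_27_jul * 50, 2))   # 0.88 -> after two days of losses
```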

For the S&P 500, the bad news for Facebook sucked a good portion of the air out of the room for what had been shaping up to be a good week, but not all of it, as positive news about other S&P 500 firms and their business outlooks boosted the overall index for the week.

Alternative Futures - S&P 500 - 2018Q3 - Standard Model - Snapshot on 27 Jul 2018

Overall, the week was relatively quiet on the news front, with investors largely focused on the busiest reporting week of 2018-Q3's earnings season for the S&P 500's component firms. We noted the following headlines as being of interest during Week 4 of July 2018.

Monday, 23 July 2018
Tuesday, 24 July 2018
Wednesday, 25 July 2018
Thursday, 26 July 2018
Friday, 27 July 2018

If you would like to get a bigger picture of the news for the U.S. markets and economy in Week 4 of July 2018, be sure to check out Barry Ritholtz' outline of the week's news, as grouped into its positives and negatives.

And that's the news of the week that was. If you're still hungry for more, we're going to take a closer look at Facebook's record-setting crash in the very near future, where we'll redefine how the records for one-day crashes for individual stocks should be measured.


27 July 2018

To say that the City of Philadelphia has a problem with managing the money that it collects from all the taxes it imposes on metropolitan residents and visitors, such as its highly controversial soda tax, may be an understatement. The city's new controller, Rebecca Rhynhart, has turned up some stunningly large discrepancies in the municipality's accounts.

Philadelphia has the worst accounting methods of the nation’s 10 largest cities, according to an audit released on Tuesday by Philadelphia city controller Rebecca Rhynhart.

The evidence, according to Rhynhart: The city has lost track of $33 million from its largest cash account, and it has made $924 million in accounting errors in the last fiscal year alone.

If it helps, the following video illustrates what $30 million looks like:

Now, just imagine those bills stacked 10% higher than what's shown in the video, and you've visualized how much money has gone completely missing from Philadelphia's city coffers without a trace!

Here's the kicker. In March 2018, Philadelphia Mayor Jim Kenney proposed increasing the city's property taxes and a number of other taxes with the goal of collecting an additional $980 million over the next five years. Although Kenney has since slightly scaled back his proposed tax hikes, the city expects that it would still collect an additional $900 million over the next five years from them.

If you lived in Philadelphia, would you vote to force yourself and your neighbors to give a city government with a history of such sloppy bookkeeping nearly a billion more dollars of revenue from higher taxes? Especially a city government that can't seem to find any trace of $33 million that it previously collected?

Thanks in large part to the lack of urgency with which the city's government has approached its accounting problems, the mystery of Philadelphia's missing millions has become a fraud investigation:

City Controller Rebecca Rhynhart announced Tuesday a fraud investigation into the roughly $30 million in unaccounted-for-dollars the Kenney administration is trying to track down and criticized Finance Director Rob Dubow and her former boss Mayor Jim Kenney for a lack of urgency in correcting Philadelphia's accounting practices.

"The tone and the attitude regarding the urgency of fixing this needs to change and that needs to start with the mayor making it a significant priority," Rhynhart said during her first press conference since her inauguration in January.

"At best, this $33 million is several undocumented transactions, or money transferred into the wrong account. At worst, there’s the inducement for wrongdoing here," she said while discussing the Controller Office's annual internal control and compliance report, which was released Tuesday morning....

"We need to treat internal controls with the importance that taxpayers deserve," she said. "At a time when taxes are going up and assessed valuations are going up, we need to more than ever look at how the city itself is running its shop."

Indeed. If only more elected and appointed city officials had been doing that in all these years before.

Update 28 November 2018: The city has found most of the missing money!

Last week, accountants figured out that seven debt payments totaling $21 million had been paid from the wrong account, throwing off the city's books.

The $21 million is part of the $33.3 million in unaccounted-for funds reported earlier this year. It wasn't clear at the time whether the money was missing or misreported as the result of accounting mistakes. City Treasurer Rasheia Johnson said Tuesday that it was the latter....

Aside from the $21 million in seven misplaced debt payments, accountants also found that a $4 million payment was mistakenly recorded twice due to an initial bounced check. In other unaccounted-for transactions, amounts had been entered incorrectly in the books or deposited into the wrong account.

After these adjustments, along with a number of others, the total discrepancy in the city's books has been reduced from nearly $40 million down to $2.9 million. Jim Engler, Philadelphia Mayor Jim Kenney's Chief of Staff, has indicated that no city employees will be fired over the $40 million worth of accounting mistakes, which means that, going forward, there will continue to be no penalty for city employees who mishandle the city's funds through sloppy bookkeeping.


26 July 2018

Roughly every three months, we check in on the overall state of the new home market in the U.S. by estimating its market capitalization, where we take the average price of new homes sold in the U.S., multiply it by the number of new homes sold each month, and then calculate the trailing twelve month average of that product to minimize the effects of seasonal variation in the monthly data. The following animated chart shows the results of that math, both in nominal terms and adjusted for inflation, for the period from December 1975 through the preliminary data just reported for June 2018. [If you're accessing this article on a site that republishes our RSS news feed, please click through to our site to see the animation.]

Animation: Trailing Twelve Month Average New Home Sales Market Capitalization, Not Adjusted for Inflation [Current U.S. Dollars] and Adjusted for Inflation [Constant June 2018 U.S. Dollars], December 1975 - June 2018
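Before getting to the results, here's a minimal sketch of that market cap math, using hypothetical monthly figures rather than the Census Bureau's actual data:

```python
# A minimal sketch of the new home market capitalization estimate described above.
# The monthly prices and sales counts are hypothetical placeholders.
average_sale_price = [370_000, 368_000, 375_000]   # average new home sale price, $ per month
new_homes_sold = [54_000, 53_000, 55_000]          # new homes sold per month

monthly_market_cap = [p * n for p, n in zip(average_sale_price, new_homes_sold)]

# A trailing twelve-month average smooths out seasonality; with only three months
# of placeholder data here, we simply average what we have.
trailing_average = sum(monthly_market_cap) / len(monthly_market_cap)
print(f"${trailing_average / 1e9:.1f} billion")    # on the order of the ~$20 billion noted below
```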

In nominal terms, the trailing twelve month average of the market capitalization of the U.S. new home market has been rising since our last report three months ago, but at a decelerating pace. The preliminary data for June 2018 suggests that growth stalled during the month, with both May and June 2018 recording a trailing twelve month average market cap of $20.1 billion in new home sales. This data will be revised several times over the next several months, so that may not be the final outcome once all the data is finalized, but it is safe to say that the rate of growth of the new home market in the U.S. has slowed.

Scanning the news, CoreLogic has reported that new home sales in California have dropped sharply in June 2018, deviating from the pattern in the rest of the U.S. during the peak summer home-selling season. The California Association of Realtors has indicated that sale prices in the state reached an all-time high.

Going back to the national data, the U.S. Census Bureau reports that the median new home sale price for June 2018 continued its downward trend of recent months. The following chart shows the trailing twelve month average of median new home sale prices versus median household income, where the recent upward trend appears to have peaked in April 2018.

U.S. Median New Home Sale Prices vs Median Household Income, Annual: 1999 - 2016 | Monthly: December 2000 - June 2018

Although based on preliminary data, the recent decline of median new home sale prices has come while median household income has been increasing, so the ratio of prices to incomes has begun to back off the peak it reached in February 2018, making new homes slightly more affordable in recent months.

Ratio of Trailing Twelve Month Averages for Median New Home Sale Prices and Median Household Income, Annual: 1967 to 2016 | Monthly: December 2000 to June 2018

The ratio of the trailing year averages of median new home sale prices to median household income is however still elevated, with median new homes selling for just under 5.4 times the typical income of an American household in June 2018.
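If you'd like to reproduce that affordability math, here's a minimal sketch, where the trailing twelve month averages are hypothetical round numbers rather than the actual Census and Sentier figures:

```python
# A minimal sketch of the price-to-income ratio plotted above, using hypothetical
# trailing twelve-month averages rather than the actual Census and Sentier data.
median_new_home_price = 326_000      # trailing 12-month average, dollars (hypothetical)
median_household_income = 61_000     # trailing 12-month average, dollars (hypothetical)

ratio = median_new_home_price / median_household_income
print(round(ratio, 2))               # ~5.34 -> just under 5.4 times typical household income
```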

References

U.S. Census Bureau. New Residential Sales Historical Data. Houses Sold. [Excel Spreadsheet]. Accessed 25 July 2018.

U.S. Census Bureau. New Residential Sales Historical Data. Median and Average Sale Price of Houses Sold. [Excel Spreadsheet]. Accessed 25 July 2018.

U.S. Department of Labor Bureau of Labor Statistics. Consumer Price Index, All Urban Consumers - (CPI-U), U.S. City Average, All Items, 1982-84=100 [Online Application]. Accessed 25 July 2018.

Sentier Research. Household Income Trends: June 2018. [PDF Document]. Accessed 25 July 2018. [Note: We've converted all data to be in terms of current (nominal) U.S. dollars.]



25 July 2018

If you're one of those people who are intimidated by math the more advanced it gets, buckle in, because we're going to push the limits!

Let's get there in bite-size stages, starting with the set of "real" numbers, which is made up of all the whole, rational and irrational numbers that you might find on the kind of number line you might remember from your days in school, whether they be positive, negative, or zero.

Real Number Line - Source: https://math.tutorvista.com/number-system/real-number-line.html

Real numbers are the kinds of numbers that you should be pretty familiar with from daily life, where you'll find yourself adding, subtracting, multiplying and dividing them pretty often or, as may be the case, letting the computer-based devices you use do that work for you.

But while they may seem pretty complex, they're not what mathematicians would call complex numbers, which are one step up in complexity from real numbers. Complex numbers are made up of the set of all the real numbers that we've already covered, plus the so-called "imaginary" numbers, which are the numbers that become negative when you multiply them by themselves. Unless you're working in a field like physics or engineering, you're unlikely to run into these kinds of numbers often, if at all, but you should appreciate that they make it possible to describe or predict how things like electronic circuits work.

Example of Use of Complex Numbers in Electronic Circuit Applications - Source: Quora - https://www.quora.com/Does-the-realness-of-i-i-have-any-implications-in-the-world-of-physics-or-any-other-realms-that-utilize-complex-numbers

The next stage gets much more complex. Called quaternions, these represent an extension of the complex numbers into a four-dimensional kind of number that happens to be really useful if you need to describe or predict how a body might move through space and time (or, in simpler form, how stock prices change). (The following image shows a 2-D representation of the zeroth through fourth dimensions.)

Dimensional Levels 0 through 4 - Source: Wiki Commons - https://commons.wikimedia.org/wiki/File:Dimension_levels.svg

Like real and complex numbers, quaternions can be added, subtracted, multiplied and divided, but with the wrinkle that their multiplication is non-commutative: it matters what order you follow in multiplying them, because the commutative property of multiplication no longer applies. What that means is that if you have quaternion values A and B that you are multiplying together, you'll get a different answer if you multiply them in the order A*B than you will if you multiply them in the order B*A.
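To see that wrinkle in action, here's a minimal sketch that multiplies the quaternion units i and j in both orders using the standard Hamilton product:

```python
# A minimal demonstration that quaternion multiplication is non-commutative.
# Quaternions are stored as (w, x, y, z), representing w + x*i + y*j + z*k.

def q_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    )

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)

print(q_mul(i, j))   # (0, 0, 0, 1)  -> i*j =  k
print(q_mul(j, i))   # (0, 0, 0, -1) -> j*i = -k, so A*B != B*A
```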

Octonions turn the complexity dial all the way up to eight, absorbing all the quaternions (which previously absorbed all the complex numbers, which previously absorbed all the real numbers) while adding four additional parameters into the mix. By definition, that makes them eight-dimensional quantities, but here, the associative property of multiplication stops working the way it does for real, complex and quaternion numbers: the answers you get from the math are now determined in part by how you group the quantities being multiplied together. That's on top of the commutative property of multiplication having already broken down at the quaternion level.

Which brings us to why we're even talking about these kinds of numbers today: thanks to recent work by physicist Cohl Furey, it turns out that they may go a long way toward describing how all matter works, all the way down to the quantum level as currently quantified by physics' Standard Model.

Furey began seriously pursuing this possibility in grad school, when she learned that quaternions capture the way particles translate and rotate in 4-D space-time. She wondered about particles’ internal properties, like their charge. “I realized that the eight degrees of freedom of the octonions could correspond to one generation of particles: one neutrino, one electron, three up quarks and three down quarks,” she said — a bit of numerology that had raised eyebrows before....

In her most recent published paper, which appeared in May in The European Physical Journal C, she consolidated several findings to construct the full Standard Model symmetry group, SU(3) × SU(2) × U(1), for a single generation of particles, with the math producing the correct array of electric charges and other attributes for an electron, neutrino, three up quarks, three down quarks and their anti-particles. The math also suggests a reason why electric charge is quantized in discrete units — essentially, because whole numbers are.

However, in that model’s way of arranging particles, it’s unclear how to naturally extend the model to cover the full three particle generations that exist in nature. But in another new paper that’s now circulating among experts and under review by Physics Letters B, Furey uses C⊗O to construct the Standard Model’s two unbroken symmetries, SU(3) and U(1). (In nature, SU(2) × U(1) is broken down into U(1) by the Higgs mechanism, a process that imbues particles with mass.) In this case, the symmetries act on all three particle generations and also allow for the existence of particles called sterile neutrinos — candidates for dark matter that physicists are actively searching for now. “The three-generation model only has SU(3) × U(1), so it’s more rudimentary,” Furey told me, pen poised at a whiteboard. “The question is, is there an obvious way to go from the one-generation picture to the three-generation picture? I think there is.”

This could become a very interesting space to watch! Particularly if it succeeds in re-directing physicists away from expensive experiments that involve colliding subatomic particles together at near-light speed to try to generate new ones that physics' Standard Model suggests may not exist.

And then, if the octonion-based math could resolve the issue of quantum gravity, that would be something too!

Until then, we'll leave you to consider the nature of a four dimensional cube that has been animated in 2D, because it's a lot easier to grasp than octonions....


24 July 2018

We're going to cover more than one dividend related topic in this post, so if your time is limited, here's the short and sweet summary:

  1. According to quarterly dividend futures, dividends are set to grow in 2018-Q3 but be flat in 2018-Q4.
  2. According to annual dividend futures, the quarterly dividend futures are wrong about where the total for the year is going to end up.
  3. The total of 42 dividend cuts recorded by S&P for the total U.S. stock market in June 2018 is almost certainly wrong.

Let's drill down into the details in order, starting with our chart showing actual quarterly dividends for the S&P 500 from the first quarter of 2017 through the CME Group's quarterly dividend futures forecast for both 2018-Q3 and 2018-Q4.

Actual and Forecast S&P 500 Quarterly Dividends per Share, 2017-Q1 through 2018-Q2, with Forecast for 2018-Q3 and 2018-Q4

The way the quarterly dividend futures are tracking, 2018-Q3 is looking as if it will see a significant boost over 2018-Q2. However, the dividend futures are also predicting that the growth of quarterly dividends per share will stall out in 2018-Q4.

But this forecast is almost certainly wrong, which we can confirm with the CME Group's more actively traded annual dividend futures for the S&P 500. To illustrate what we mean, we've calculated the rolling 12-month total of quarterly dividends per share from 2017-Q1 through the forecast for 2018-Q4, which shows what the annual dividends per share for 2018 would be if the forecast quarterly dividends per share were on target. We can then compare that last value with what the annual dividend futures indicate the total dividend payout for the S&P 500 will be in 2018.
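Here's a minimal sketch of that rolling 12-month math, with hypothetical dividends-per-share figures standing in for the actual quarterly and annual dividend futures values:

```python
# A minimal sketch of the rolling four-quarter (12-month) dividend total described
# above. These dividends-per-share figures are hypothetical placeholders, not the
# CME Group's actual quarterly dividend futures values.
quarterly_dps = {
    "2017-Q1": 11.95, "2017-Q2": 12.20, "2017-Q3": 12.45, "2017-Q4": 12.70,
    "2018-Q1": 13.00, "2018-Q2": 13.40, "2018-Q3": 13.90, "2018-Q4": 13.90,
}

quarters = list(quarterly_dps)
values = list(quarterly_dps.values())

# Trailing four-quarter sum, available once a full year of quarters is in hand.
rolling_12m = {quarters[i]: round(sum(values[i - 3:i + 1]), 2) for i in range(3, len(values))}

implied_2018_total = rolling_12m["2018-Q4"]    # sum of the four 2018 quarters
annual_futures_2018 = 53.30                    # hypothetical annual dividend futures quote

# A persistent gap between the two suggests the quarterly futures are mispriced.
print(rolling_12m)
print(implied_2018_total, annual_futures_2018, round(implied_2018_total - annual_futures_2018, 2))
```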

Actual and Forecast S&P 500 Annual (Rolling 12-Month) Dividends per Share, 2017-Q1 through 2018-Q2, with Forecast for 2018-Q3 and 2018-Q4

The difference between the rolling 12-month total of the forecast quarterly dividend futures through the end of 2018 and the annual dividend futures forecast for the total dividend payout for the S&P 500 in 2018 means that the forecast quarterly dividends for 2018-Q3 and 2018-Q4 are almost certainly incorrect at this point. It's not quite the same "money on the sidewalk" situation we commented upon earlier this year, but it's something that might pique the interest of options traders.

Finally, several weeks ago, we couldn't help but notice the large discrepancy between the number of dividend cuts reported by Standard & Poor's for the total U.S. stock market in June 2018 and our near-real-time tally of 9 cuts declared during the month, which motivated us to dig deeper into the numbers. Here's the chart we featured in our earlier post for reference.

Number of Public U.S. Companies Increasing or Decreasing Dividends in Each Month from January 2004 through June 2018

We contacted S&P's Howard Silverblatt, who was kind enough to provide a copy of the dividend cuts that were captured by S&P's D1 database, which contained information for 41 such actions. After parsing through the data, we found that S&P's D1 system had captured a lot of noise in the form of companies that had declared dividend cuts in previous months, but which had gone ex-dividend in June 2018.

There was also some additional noise related to a couple of firms' stocks that had gone through splits, where no real dividend cuts had occurred, although it might appear that way to an automated system. And then, there were a few dividend declarations that had been incorrectly captured in the screen for dividend cuts, where the firms had either increased their dividend or left them unchanged.
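A simplified version of that kind of screening might look something like the following sketch, where the column names and figures are hypothetical and do not reflect the actual layout of S&P's D1 data:

```python
import pandas as pd

# Hypothetical dividend action records, standing in for S&P's D1 extract.
d1 = pd.DataFrame({
    "ticker":        ["AAA", "BBB", "CCC", "DDD"],
    "declared_date": pd.to_datetime(["2018-06-05", "2018-05-10", "2018-06-12", "2018-06-20"]),
    "old_dividend":  [0.50, 0.40, 0.60, 0.30],    # dollars per share, pre-split basis
    "new_dividend":  [0.25, 0.20, 0.30, 0.35],
    "split_ratio":   [1.0, 1.0, 2.0, 1.0],        # 2.0 = a 2-for-1 stock split
})

# Keep only actions actually declared in June 2018, dropping earlier declarations
# that merely went ex-dividend during the month.
june = d1[d1["declared_date"].dt.strftime("%Y-%m") == "2018-06"]

# Adjust for splits, then keep only true reductions in the per-share payout,
# which also screens out increases and unchanged dividends.
adjusted_new = june["new_dividend"] * june["split_ratio"]
true_cuts = june[adjusted_new < june["old_dividend"]]

print(true_cuts[["ticker", "declared_date"]])     # only "AAA" survives the screen
```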

After all the dust cleared, we counted a total of 7 dividend cuts declared in June 2018, 5 of which matched firms we had already captured in our own June 2018 tally. Assuming that we successfully captured all the other dividend cuts declared during the month, that would put the total for June 2018 at 11 firms, where we would add Aceto (NASDAQ: ACET) and Territorial Bancorp (NASDAQ: TBNK) to our tally as the two in S&P's list that hadn't previously appeared in ours.

What that means is that S&P's automated system doesn't perfectly capture all the dividend cuts that might be declared in a given month, where it appears to have been more glitchy in June 2018 than in previous months. The good news is that June 2018's estimated 11 actual declared dividend cuts would make it one of the best months in years for the U.S. stock market as measured by the relative absence of dividend cuts announced during it.


23 July 2018

Earnings season for 2018-Q3 is off to a positive start, and with the S&P 500 now just 2% away from setting new records, investors have shifted a portion of their forward-looking attention away from 2019-Q1 to focus on the current quarter.

At least, that's one interpretation that explains what we're seeing in the alternative futures spaghetti forecast chart for the S&P 500.

Alternative Futures - S&P 500 - 2018Q3 - Standard Model - Snapshot on 20 Jul 2018

Assuming that it is 2018-Q3 that they're focusing upon, it looks like investors are placing about 25% of their attention on 2018-Q3 and about 75% on the more distant future quarter of 2019-Q1 in setting today's stock prices.
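Here's a minimal sketch of what that attention weighting means in practice, where the trajectory levels are hypothetical placeholders rather than our model's actual output:

```python
# A minimal sketch of the weighted-attention idea described above: the indicated
# index level is treated as a blend of the "alternative futures" trajectories.
# The trajectory levels are hypothetical placeholders, not our model's output.
trajectories = {
    "2018-Q3": 2830.0,    # level if investors fully focused on 2018-Q3
    "2019-Q1": 2790.0,    # level if investors fully focused on 2019-Q1
}
attention = {"2018-Q3": 0.25, "2019-Q1": 0.75}

blended_level = sum(trajectories[q] * attention[q] for q in trajectories)
print(round(blended_level, 1))   # 2800.0 -- compare against the S&P 500's actual close
```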

The main question for the market going forward is to what extent the ongoing tit-for-tat tariff wars might negatively affect the business outlooks of traditional S&P 500 firms, and to what extent the updated outlooks of the index' biggest tech firms might offset that impact.

So the upcoming fourth week of July 2018 could be a week with a lot of market-moving news. Unlike last week, which was pretty uneventful where moving the market was concerned....

Monday, 16 July 2018
Tuesday, 17 July 2018
Wednesday, 18 July 2018
Thursday, 19 July 2018
Friday, 20 July 2018

For more background, Barry Ritholtz has summarized the top six positives and negatives for markets and economics news in Week 3 of July 2018.


20 July 2018

We missed it when it first came out, but nearly 1.2 million views later, if solving quadratic equations is still burning your waffles, you might want to enroll in the rap-to-'rithmetic school of learning....

It could be worse. You might be compelled to get triggy with it.

It's not your imagination. Hiphop sold out and went mainstream a long time ago.

Others in the series:


19 July 2018

The Pennsylvania Supreme Court ended months of legal jeopardy for Philadelphia's controversial sweetened beverage tax on Wednesday, 18 July 2018, when it ruled that the tax does not violate Pennsylvania state law.

In a 4-2 majority opinion, the court found that the city had not violated state law by taxing the distribution of beverages. Opponents of the tax had argued that the levy amounted to double taxation because it is passed down to consumers who already pay sales taxes.

The ruling provided a more secure future for the 1.5-cents-per-ounce tax that funds Pre-K, community schools and improvements to parks, libraries, and recreation centers and ended a case that had been closely watched by other cities since Philadelphia became the first big U.S. city to pass a tax on soda and sweetened beverages in 2016.

The state high court's decision largely hinged on whether the well-established economic principle of tax incidence would be recognized within the state of Pennsylvania, which is evident from one of the dissenting minority opinions in the case.

The opinion affirmed a ruling issued last year by the state’s Commonwealth Court. Justices Max Baer, Debra Todd, and Christine Donohue joined Saylor in the majority. Justice David N. Wecht and Justice Sallie Updyke Mundy each wrote dissenting opinions. Justice Kevin M. Dougherty, who is from Philadelphia, did not participate in the case.

Wecht, in his dissent, called the idea that the city’s tax is levied on beverage distribution “convenient fiction,” writing instead that taxing distribution of beverages specifically intended for retail sales is an illegal duplication of the state sales tax.

“A rose by any other name smells just as sweet, and, whether styled a retail tax or a distribution tax,” he wrote, and the tax at issue ends up “burdening the proceeds from the retail sale of sugar-sweetened beverages.”

A nearly identical tax in Chicago was repeatedly blocked under Illinois state law and federal law by judges who recognized the tax incidence concept back in 2017, forcing that city's legislators to considerably revise it before it was allowed to go into effect. That soda tax subsequently proved to be so unpopular that it was repealed after two months of being in effect.

Philadelphia's beverage tax has likewise been unpopular, where it has underperformed both city officials' original expectations and their subsequently lowered expectations.

Desired vs Actual Estimates of Philadelphia's Monthly Soda Tax Collections, January 2017 through April 2018

It definitely hasn't been the kind of reliable source of tax revenue for funding things like park improvements, community schools or "free" pre-Kindergarten/daycare programs that Philadelphia residents might have hoped to have. Then again, Philadelphia city officials might want to rethink their plans for instituting a city-wide pre-K program, given the surprisingly negative results of a large randomized controlled trial that were just recently reported.

Not to mention the apparently massive problems that Philadelphia city officials have with responsibly handling taxpayer money, but that's a different story for another day!


18 July 2018

How well are typical American households faring at this point in 2018?

To answer that question, we're going to turn to a unique measure of the well-being of a nation's people called the "national dividend". The national dividend is an alternative way to measure the economic well-being of a nation's people that is based primarily upon the value of the things they choose to consume in their households. That makes it very different from, and by some accounts a much more effective measure than, the more common income- or expenditure-based measures that apply at all levels of the economy, such as GDP, which have proved to be poorly suited for assessing the economic welfare of the people themselves.

To get around that problem, we've been developing the national dividend concept originally conceived by Irving Fisher back in 1906, which fell by the wayside in the years that followed because of the difficulty of collecting the household consumption data needed to make the analysis possible. It wasn't until we got involved in 2015 that anybody thought it might even be possible to develop with the kind of data that has become available in recent decades, and we've been working since then to make the national dividend measure of economic well-being into a reality.

With that introduction now out of the way, let's update the U.S.' national dividend through the end of May 2018 following our previous snapshot that was taken for data available through September 2017.

Monthly National Dividend, January 2000 through May 2018

Through May 2018, the national dividend continues to indicate that America's households are enjoying a robust improvement in their overall economic welfare, having grown from a preliminary estimate of $7.63 trillion (or a revised $7.78 trillion) in September 2017 to a new high of $8.13 trillion through May 2018.
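For readers new to this series, the basic aggregation behind the estimate is straightforward, as the following minimal sketch shows. The inputs are hypothetical placeholders, and the actual analysis involves additional data sources and adjustments covered in the posts linked below:

```python
# A minimal sketch of the aggregation behind the national dividend estimate: an
# average consumption expenditure per household multiplied by the estimated number
# of U.S. households. Both inputs below are hypothetical placeholders.
avg_household_consumption = 64_000     # dollars per household per year (hypothetical)
number_of_households = 127_000_000     # estimated U.S. households (hypothetical)

national_dividend = avg_household_consumption * number_of_households
print(f"${national_dividend / 1e12:.2f} trillion")   # ~$8.13 trillion
```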

Notes

This is the first update that we've had since December 2017, where we've largely kept this analytical series on the back burner following Sentier Research's suspension of its monthly household income estimates data series from May 2017 to March 2018. With Sentier Research having resumed presenting their data series, we hope to present the national dividend on a more frequent basis.

We've incorporated the U.S. Census Bureau's April 2018 update to the estimated number of households in the U.S. in today's analysis, which covers data through March 2018. We're using projections of the number of households for April and May 2018, which we've based on recent historic trends, where we will subsequently revise this data as Census estimates become available.

Postscript

If you look more closely at the history of our chart, you'll see the very real impact on American households of the most inappropriately contractionary monetary policy that the Federal Reserve ever implemented outside of the Great Depression. If you want to find out more about that event, check out this very timely article at Forbes....

Previously on Political Calculations

The following posts will take you through our work in developing Irving Fisher's national dividend concept into an alternative method for assessing the relative economic well being of American households.

References

Chand, Smriti. National Income: Definition, Concepts and Methods of Measuring National Income. [Online Article]. Accessed 14 March 2015.

Kennedy, M. Maria John. Macroeconomic Theory. [Online Text]. 2011. Accessed 15 March 2015.

Political Calculations. Modeling U.S. Households Since 1900. 8 February 2013.

U.S. Bureau of Labor Statistics. Consumer Expenditure Survey. Total Average Annual Expenditures. 1984-2015. [Online Database]. Accessed 7 February 2017.

U.S. Bureau of Labor Statistics. Consumer Price Index - All Urban Consumers (CPI-U), All Items, All Cities, Non-Seasonally Adjusted. CPI Detailed Report Tables. Table 24. [Online Database]. Accessed 16 July 2018.


17 July 2018

Baseball's All Star Game (also known as the "Midsummer Classic", although nobody outside of the evil influence of Major League Baseball's marketing staff calls it that) is upon us once again, and once again, thanks to many more years of uninteresting execution, we have to recognize that it's kind of a disaster.

Perhaps not as big a disaster as the National Football League's Pro Bowl or the National Basketball Association's All-Star Game, but still, for the sport that calls itself the "national pastime", fans deserve better.

We have an idea for how to fix baseball's All Star Game that was ahead of its time back in 2009 when we first proposed it, but may now be more relevant given the recent experience of major league baseball players. Let's recap....


Once upon a time, baseball's All Star Game was a competitive event. Joe Sheehan describes the "good old days":

Baseball’s All-Star Game was once a cutthroat battle between two distinct and competitive entities, one of just two times all season that the leagues interacted. The game was played largely by the very best players in baseball, and those players often went the distance. If you wanted to see Babe Ruth face Carl Hubbell, or Bob Feller take on Stan Musial, or Warren Spahn pitch to Ted Williams, the All-Star Game was just about your only hope.

In the modern era, the All-Star Game has been reduced to the final act of a three-day festival, in recent seasons often overshadowed by the previous night’s Home Run Derby. Rather than a grudge match between rivals, it’s an interconference game like the NFL, NBA, and NHL events. The individual matchups, once unique, have been diluted by interleague play.

But that's not the worst part. Sheehan goes on to describe how players and managers used to approach the game, comparing the past to how things are done today:

In the first All-Star Game back in 1933, the starting lineups went the distance. The AL made just one position-player substitution, getting legs in for Babe Ruth late in the game. In the NL, the top six hitters in the lineup took all their at-bats. Each team used three pitchers. A quarter-century later, this was still the general idea: seven of the eight NL position-player starters in the 1958 game went the distance, five AL hitters did, and the teams used just four pitchers each. The best players in baseball showed up trying to win to prove their league’s superiority.

Then it all went awry. Before interleague play or 32-man rosters or All-Star Monday, there were the years of two All-Star Games. From 1959 through 1962, the AL and NL met twice each summer as a means of raising revenues for the players’ fledgling pension fund. In '58, 32 players played in the All-Star Game, 12 of them staying for the entire game. In 1963, the first year after the experiment, 41 players played and just five went the distance. The 1979 game, one of the all-time best contests, saw 49 players used and had just three starters who were around at the end. Fast-forward to 2007—the last nine-inning All-Star Game—and you find 55 players in, 17 pitchers used to get 54 outs, and not a single starter left in the game at its conclusion.

The All-Star Game has lost its luster because the game isn’t taken seriously by the people in uniform. Don’t read what they say—watch what they do. That’s the damning evidence that the participants care less about winning than they do about showing up.

Baseball! So what to do? In recent years, Major League Baseball has tried to incentivize the players and managers of the teams representing the National League and American League by awarding home field advantage in the World Series to the team from the winning league in the All Star Game. But is that really working?

Given that the managers of each league's all-stars are the managers of the teams that went to the previous year's World Series, who most often by the All-Star Game know that their own teams are unlikely to repeat, what's the point? If they're not in the running by that point of the season, why manage their All-Star team to win for an advantage they themselves won't realize? The same fact holds especially true for the players, with maybe as many as a half-dozen on each side playing for teams with a realistic shot at making it to the World Series in the fall.

Bob Ryan describes how history has played out since league home-field advantage has been made the purpose of the All-Star Game:

Sadly, Bud Selig and his marketing minions don’t get it.

Bud is haunted by the tie game in his own hometown seven years ago. It was second only to the cancellation of the 1994 World Series as the worst event of his tenure as baseball commissioner, and he is determined it will not happen again. He thinks the solution is to expand the rosters for Tuesday night’s game into the absurdly bloated 33 players apiece, 13 of whom will be pitchers. As is almost invariably the case in these matters, more is less.

In order to restore so-called “meaning’’ into the game, Bud declared that, beginning with the 2003 game, the teams would be playing for home-field advantage in the World Series. That hasn’t prevented the American League from winning the first six games played under this policy, nor has it prevented the National League from winning the World Series despite lack of said home-field advantage in 2006 and 2008.

But neither of those are the point. The truth is that Bud wants it both ways. He wants the game to be played for a proper prize, and yet he has allowed the game to evolve into something far less than a real baseball contest.

Ryan argues that the way to make the All-Star Game more meaningful would be to shrink each team's roster back to 25 players and tell the managers to focus on winning, but still, they're having to go out of their way to play a game that offers nothing of real value to either the players or the managers.

If you're a player, why risk injury by playing your heart out? If you're a manager, why not coach by the same rules that apply to T-ball, where everybody getting a turn is more important than winning the game?

Here's our idea: put a real world championship at stake in the All-Star Game. Create two potential teams to represent the United States in the World Baseball Classic, Olympics or other international competition that might apply, one from the National League, the other from the American League. Players for the team that wins the All-Star Game would then be the ones that would literally get to go for the gold for the next year.

The Positions! That might create a problem in selecting players, given the increasing internationalization of baseball in the United States. Here, at least 25 of the players on each team, covering each position needed for international play, would have to satisfy the eligibility rules to represent the U.S. Additional players, who might not be eligible to represent the United States, could then fill out the remaining slots on each team's roster to bring the total up to 33.

With the opportunity to coach the team in international play at stake, the All-Star Game might make for a real challenge for managers, who would be compelled to put the best players possible out on the field, regardless of eligibility, and who might then have to face those same additional players on other teams in international play.

And wouldn't that, just by itself, be a lot more interesting than how the All-Star Game is played today.


What's different in 2018 that wasn't the case in 2009 is the positive experience that major league baseball players had in the 2017 World Baseball Classic, which for many, was the first time that they were encouraged to have fun on the field since they played Little League. Baseball's attempted expansion into World Cup-style international play may provide the incentive the sport needs to transform its All-Star Game into something more meaningful.


