Elasticity of Substitution – Louth Online http://louthonline.com/

Global rubber market to surpass $65.73 billion valuation http://louthonline.com/global-rubber-market-to-surpass-65-73-billion-valuation/ Thu, 17 Nov 2022 12:36:03 +0000

Westford, USA, Nov. 17, 2022 (GLOBE NEWSWIRE) — Global rubber demand will stabilize over the medium to long term as the macroeconomic environment improves, according to a SkyQuest report. Global natural rubber demand is forecast to grow at a CAGR of over 1.5%, reaching 18 million metric tons by 2028. However, this growth rate is lower than that seen from 2010 to 2018, owing to technological advances in alternative materials and the substitution of some grades of rubber for others.

Growth in the global rubber market is held back by price volatility and the substitution of natural rubber for certain grades of synthetic rubber. Over time, however, we are seeing renewed interest in synthetic rubbers as their use increases in lubricating elements for machinery and automotive parts. Demand for natural rubber is expected to remain strong thanks to its many applications, such as roofing membranes and automotive gaskets. Alternative materials such as silicone and nitrile are also valued for their better weather resistance and temperature tolerance than other synthetic rubbers.

Get a sample copy of this report: https://skyquestt.com/sample-request/rubber-market

Major factors driving synthetic rubber market growth include increasing usage in automotive and consumer product applications, as well as rising R&D investment in new technologies such as biopolymers.
The increasing use of renewable energy is also expected to drive demand for environmentally friendly synthetic rubber products.

Challenges

Tire manufacturers face a number of challenges in the rubber market, such as raw material shortages and rising costs. These challenges are likely to affect the tire industry globally, so companies operating in this market must be prepared to deal with them. The shortage of raw materials is a major challenge for tire manufacturers, as it has driven up rubber stock prices. Global production in the rubber market exceeded 27.4 million metric tons in 2021. However, growing demand from the automotive and construction industries will put pressure on the availability of rubber supplies. This situation is expected to continue until at least 2027, due to increased investment in synthetic alternatives and the expansion of natural rubber sources in countries such as Brazil and Indonesia. Thailand produces around 35% of the world's natural rubber, while China is the largest producer of synthetic rubber.

Cheaper synthetic substitutes have emerged in recent years as an option for tire manufacturers looking to cut costs. However, these products are not always reliable and may wear out faster than natural rubber products. Additionally, synthetic rubbers often require different manufacturing processes, which can add cost and time to the overall process. As a result, OEMs in the global rubber market are increasingly turning to natural rubbers for next-generation tires because of their durability and low emission levels.

Rising labor costs are another major challenge for tire manufacturers. According to Forrester Research's Global Manufacturing Competitiveness Report (GMC), labor costs have risen faster than product costs or non-labor input prices over the past five years in most of the countries studied, including China.

Synthetic Rubber to Generate $45 Billion in Global Rubber Market Revenue

The synthetic rubber market is expected to generate over USD 45 billion in revenue by 2028, growing at a CAGR of over 6% during the forecast period. Key drivers of this growth include rising demand from the automotive and construction industries, as well as demand for rubber products from downstream industries such as adhesives and sealants. The growth in synthetic rubber production is attributed to the growing demand for fuel-efficient and environmentally friendly vehicles.
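The CAGR figures in these forecasts follow the standard compound-growth formula; a minimal sketch of the arithmetic (the implied 2022 base value is back-calculated for illustration, not taken from the report):

```python
# Compound annual growth rate (CAGR) arithmetic behind the forecast.
def project(base, cagr_rate, years):
    """Project a value forward at a constant compound growth rate."""
    return base * (1 + cagr_rate) ** years

def cagr(start, end, years):
    """Back out the constant growth rate linking two values."""
    return (end / start) ** (1 / years) - 1

# A ~6% CAGR over the six years 2022-2028 compounds to ~42% cumulative growth,
growth_factor = project(1.0, 0.06, 6)    # ~1.4185
# so a market reaching $45 billion in 2028 implies a ~$31.7 billion 2022 base.
implied_2022_base = 45 / growth_factor   # hypothetical back-calculation
```

The same two functions reproduce the natural-rubber projection (1.5% CAGR to 18 million metric tons by 2028) by swapping in those numbers.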

https://skyquestt.com/report/rubber-market

Although natural rubber has some advantages such as high elasticity, low coefficient of friction and fatigue resistance, it has several disadvantages such as shortages and environmental concerns. In recent years, sensors and controllers in the automotive rubber market have been made with plastics containing recycled materials such as polypropylene and acrylics (synthetic rubbers). This has led to an increase in demand for these types of plastics, which are mostly made up of water and oil. As a result, the production of synthetic rubbers is expected to grow at a CAGR of 6% during the period 2022-2028.

Some of the prominent players in the synthetic rubber market include BASF SE (Germany), ChemChina Co Ltd (China), DowDuPont Inc (US), Exxon Mobil Corporation (US), Genuine Corp (Japan), Interscience Associates LLC (USA), LG Chem Ltd (South Korea), Mitsubishi Tanabe Plastics Group, Ltd. (Japan), Nexans NV (Belgium/France) and PTT Shinawatra Plastics Industry.

Despite this growth, the industry faces several challenges. These include environmental concerns related to manufacturing polymers from petroleum, rising raw material costs and rising labor costs. Additionally, new materials such as carbon fiber composites and nanocomposites are competing with synthetic rubbers in some markets. SkyQuest believes these challenges can be overcome through innovation and strong execution.

Thailand, Indonesia, China and South Korea account for over 43% of rubber market revenue

World rubber production stands at 27.3 million tonnes, of which Thailand produces 35% of natural rubber, followed by Indonesia. South Korea, meanwhile, accounts for more than 12% of natural rubber production. Our study suggests that more than 90% of the rubber produced in Thailand and Indonesia is exported to countries around the world.

Thailand is the world’s largest producer of natural rubber in the global rubber market with an output of 4.3 million tonnes in 2021. Indonesia is third with an output of 11,000 metric tonnes while China ranks fourth with a production of 3.09 million tons. The main production areas for Thailand are central and southern Thailand, while for Indonesia it is mainly Sulawesi and Java. For China, it is mainly Hebei province while for South Korea, it is mainly Jeolla province. The main production areas in each country vary according to climatic and soil conditions.

In China, crude rubber feedstock is extracted from raw materials including latex and other residues left after refining crude oil products such as gasoline or diesel fuel. China is the world's largest producer of synthetic rubber and manufactures about two-thirds of the world's rubber products. In 2021, the country produced 6.3 million tons of synthetic rubber, and its production capacity is growing at a CAGR of over 9.7%. The strong demand for synthetic rubber reflects growing concerns about the reliability and sustainability of natural rubber resources.

Speak to the analyst for your custom needs:

https://skyquestt.com/speak-with-analyst/rubber-market

Key players in the global rubber market

• ARLANXEO (Netherlands)
• Kumho Petrochemical Co., Ltd. (South Korea)
• PetroChina (China)
• TSRC Corporation (Taiwan)
• LG Chem (South Korea)
• Versalis (Italy)
• Dow Chemicals (USA)
• Von Bundit (Thailand)
• Sri Trang Agro-Industry (Thailand)
• Southland Holding Lonza (Thailand)
• Vietnam Rubber Group (Vietnam)
• Tong Thai Rubber Group (Thailand)
• Ravasco (India)
• Halcyon Agri (Singapore)

Related Reports in the SkyQuest Library:

Global Dash Cam Market

Global Gas Engine Market

Global Automotive Lithium-Ion Battery Market

Global Automotive Biometrics Market

Global Automotive Terminal Authentication Market

SkyQuest Technology is a leading growth consulting firm providing market intelligence, marketing and technology services. It has more than 450 satisfied customers worldwide.

1 Apache Way, Westford, Massachusetts 01886

Call:

USA (+1) 617-230-0741

E-mail: sales@skyquestt.com

Why Budget 2023 Should Increase Tobacco Taxes – The Island http://louthonline.com/why-budget-2023-should-increase-tobacco-taxes-the-island/ Thu, 10 Nov 2022 23:43:44 +0000

Amid widespread economic uncertainty during 2022, Sampath Bank maintained a strong capital base and a stable liquidity profile. Proactive efforts to identify challenges and implement appropriate strategies have enabled the Bank to further strengthen its soundness and stability. The Bank also continued to lead by example in demonstrating its commitment to the national growth agenda by promoting inward remittances and encouraging the influx of export earnings to the country while helping all stakeholders manage the current economic crisis. CSR activities have also been accelerated by undertaking multiple projects under the Bank’s flagship “Weweta Jeewayak” program to boost the rural economy.

The Bank declared a PAT of Rs 7.2 billion and a PBT of Rs 9.3 billion for the period ended 30 September 2022, a decline of 19.8% and 24.4% respectively compared to the figures declared for the corresponding period in 2021, reflecting the current economic crisis in the country. As of September 30, 2022, the Group declared PAT and PBT of Rs 7.7 billion and Rs 10.2 billion, a decrease of 21.6% and 24.3% respectively compared to the corresponding period of 2021.

Key Financial Highlights Reported by Sampath Bank for 2022:

• 276% growth in FX income, resulting from the sharp 82% (Rs 164.75) depreciation of the LKR against the US Dollar.
• A considerable increase of 69.5% in net fee and commission income during the period, driven by card and trade-related operations.

The Bank recorded an impairment charge of Rs 48.8 billion on loans and investments to account for possible economic uncertainties during the year.

Fund-based income

Total interest income increased by 67.7% year-on-year during the nine months ended September 30, 2022, reaching Rs 106 billion against Rs 63 billion reported for the corresponding period of the previous year. This is mainly explained by the rise in interest rates in 2022, which saw the AWPLR reach 25.95% as of September 30, 2022, an increase of 1,953 basis points compared to September 30, 2021 and an increase of 1,734 basis points compared to the end of 2021. The one-year Treasury bill rate also increased to 29.85% as of September 30, 2022, an increase of 2,284 basis points compared to September 30, 2021.
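The basis-point moves above translate into implied prior-period rates; a small sketch of the arithmetic (1 bp = 0.01 percentage point; the 2021 rates below are implied, not quoted in the text):

```python
# Basis-point arithmetic: 1 basis point (bp) = 0.01 percentage point.
def bp_to_pct_points(bp):
    return bp / 100.0

awplr_sep_2022 = 25.95                                            # %, from the text
implied_awplr_sep_2021 = awplr_sep_2022 - bp_to_pct_points(1953)  # ~6.42%
implied_awplr_end_2021 = awplr_sep_2022 - bp_to_pct_points(1734)  # ~8.61%
tbill_sep_2022 = 29.85                                            # %, from the text
implied_tbill_sep_2021 = tbill_sep_2022 - bp_to_pct_points(2284)  # ~7.01%
```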

Driven by rising market interest rates, the Bank's interest expense increased by 57.3% from the same period last year to Rs 52.8 billion for the reporting period. Prudent asset and liability management resulted in net interest income increasing by 79.4%.

Non-Fund Based Income

During the reporting period, the Bank's net fee and commission income (NFCI) increased significantly, by 69.5%, compared to the same period of the previous year. NFCI, which includes income from many sources such as loans and advances, credit cards, trade and electronic channels, grew significantly thanks to card-related activities and fee and commission income from trade-related operations.

Other net operating income for the nine months ended September 30, 2022 was Rs 18 billion. This 320% year-on-year increase was attributed to the Rs 164.75 decline in the value of the LKR against the USD. In 2022, the Bank recorded a net trading loss of Rs 3 billion against the loss of Rs 98 million recorded the previous year. Total foreign exchange revenue for the reporting period was Rs 14.5 billion, up from Rs 3.8 billion recorded last year.

Impairment charges

The Bank recognized a total impairment charge of Rs 48.8 billion for the nine months ended 30 September 2022, an increase of 396% from the charge of Rs 9.8 billion reported last year. Of this amount, the impairment charge for loans and advances amounted to Rs 37.7 billion, while Rs 10.3 billion related to other financial instruments. In addition, an impairment charge totaling Rs 839 million was recorded against other commitments and contingencies.

Impairment charge on loans and advances: To reflect the deterioration of the country's economic environment, the Bank increased the probability weighting assigned to the worst-case economic scenario and revisited the EFA model, which led to the recognition of a significantly higher impairment provision during the reporting period. The list of industries considered high risk was expanded to capture a wider range of industry-specific stressors. The potential impact of higher inflation, higher interest charges and increased taxes on the retail segment are some of the other factors considered in recognizing impairment provisions.

The Bank has reviewed the adequacy of the impairment allowance with respect to tourism customers and other similar impacted industries where necessary and adequate impairment allowances have been recorded for individually significant loan impairments. The Bank also continued to recognize a provision for impairment of customers exiting the moratorium at the end of December 2021 and June 2022, as some customers requested additional concessions given the current economic outlook. Additionally, steps have been taken to transition customers from Stage 1 to Stage 2 based on their ability to withstand the negative effects caused by the economic downturn.

The culmination of these efforts has resulted in higher overall provision coverage of 9.8% as of September 30, 2022, which is considered sufficient to help the Bank absorb potential losses resulting from difficult macroeconomic conditions.

Impairment charge on other financial instruments: The Bank provided Rs 9,040 million against SLISBs and Rs 935 million against SLDBs as of September 30, 2022. This decision was influenced by two key factors: the downgrade of Sri Lanka's sovereign rating from C to RD by Fitch Ratings in May 2022, and the debt restructuring measures currently being undertaken by the government. The Bank's accumulated impairment provision for SLDBs and SLISBs stood at Rs 21.6 billion at the end of the reporting period. In the meantime, the Bank was able to significantly reduce its exposure to FCY instruments by converting matured SLDBs to LKR instruments during the reporting period.

Net operating income

Total operating income for the period increased by Rs 40 billion. However, the impairment charge also increased by Rs 39 billion, limiting the growth in net operating income to 3.7%.

Operating Expenses

Operating expenses during the reporting period amounted to Rs 20.5 billion, an increase of 23.6% from the Rs 16.6 billion recorded during the corresponding period last year. Rising inflation and the depreciation of the LKR were the main contributors to this increase. Despite the growth in operating expenses, the Bank's cost-to-income ratio (CIR) fell significantly, by 1,460 basis points, to 25% against the 39.6% recorded in the corresponding period of 2021. This decrease in CIR was mainly due to total operating income increasing faster than total operating costs.
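The CIR figures above pin down the underlying operating-income levels; a minimal sketch of the arithmetic (the income figures are implied by the reported ratios, not quoted in the text):

```python
# Cost-to-income ratio (CIR): operating expenses / total operating income.
def cir(operating_expenses, total_operating_income):
    return operating_expenses / total_operating_income

# Income levels implied by the reported expenses and ratios (Rs billions):
implied_income_9m_2022 = 20.5 / 0.25    # ~Rs 82 bn
implied_income_9m_2021 = 16.6 / 0.396   # ~Rs 42 bn
# The reported 1,460 bp drop is simply the ratio difference in basis points:
drop_in_bp = (0.396 - 0.25) * 10_000    # 1,460 bp
```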

Tax expenses

Despite the 17.6% drop in profit before VAT, VAT on financial services increased by 9.3% due to the increase in the VAT rate from 15% to 18%, with effect from January 1, 2022.

The Inland Revenue (Amendment) Bill published on October 11, 2022 has not been substantially enacted by parliament. Therefore, the Bank did not take into account the changes proposed in the draft law for the reference period.

Main ratios

Return on average equity (after tax) fell to 8.08% as of September 30, 2022, compared to 11.05% at the end of 2021. Return on average assets (pre-tax) decreased to 0.96% as of September 30, 2022, versus 1.44% reported for 2021.

Capital ratios

As of September 30, 2022, the Bank maintained all of its capital ratios well above regulatory minimum requirements. The Bank's CET 1, Tier 1 and total capital ratios as of September 30, 2022 were 11.31%, 11.31% and 13.72% respectively, compared to 13.95%, 13.95% and 17.02% at the end of 2021. The decline in the ratios during the reporting period is due to the combined impact of the increase in risk-weighted assets resulting from the depreciation of the LKR, cash dividends and the payment of the surcharge.

Assets and liabilities

Sampath Bank’s total assets exceeded Rs 1.3 Tn at the end of September 2022, an increase of Rs 113 bn (annualized growth of 12.6%) compared to the position at December 31, 2021 of Rs 1.2 Tn. The increase in cash and cash equivalents and net loans and advances contributed to the aforementioned growth. One of the main causes of balance sheet expansion can be attributed to the devaluation of the local currency during the year.
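The "annualized growth" figures in this section can be sketched with the common linear-scaling convention (assuming that is the convention used; the report does not state it):

```python
# Simple annualization: scale a part-year growth rate linearly to 12 months.
def annualize(growth, months):
    return growth * 12 / months

# Rs 113 bn of growth on a Rs 1.2 Tn base over nine months (Jan-Sep 2022):
nine_month_growth = 113 / 1200                         # ~9.4%
annualized_growth = annualize(nine_month_growth, 9)    # ~0.1256, i.e. ~12.6%
```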

Total advances increased by 22.6% (annualized) during the reporting period from Rs 813 billion at the end of December 2021 to Rs 951 billion as at September 30, 2022. The LKR loan portfolio increased by 12.1% (annualized). It is worth mentioning that the value of foreign currency denominated loans increased significantly after the LKR depreciated by Rs 164.75 against USD during the period. If exchange rate variations had not occurred, total loans and advances would have increased by 8.8% (annualized). During 3Q22, the deposit base in LKR increased by Rs 44.4 billion due to deposit mobilization initiatives promoted by the Bank. Nevertheless, the growth of the LKR deposit base was limited to 0.8% compared to the end of 2021.

[Mission 2023] INSIGHTS DAILY CURRENT AFFAIRS + BIP CONTENTS November 05, 2022 http://louthonline.com/mission-2023-insights-daily-current-affairs-bip-contents-november-05-2022/ Sat, 05 Nov 2022 12:24:03 +0000

GS Paper 3

Syllabus: Environmental conservation, pollution and degradation, environmental impact assessment

Source: Indian Express, DTE

The context: Delhi's pollution: from the end of October, meteorological factors and stubble burning add to the already high pollution base in the Indo-Gangetic basin, in particular pollution from fine particulate matter (PM), mist and smoke.

Particulate matter (PM) consists of solid particles and liquid droplets in the air. Any type of combustion activity or dust generation is a source of PM, for example, emissions (from vehicles and chimneys of industrial facilities)

A special case is PM2.5 (particles with a diameter of 2.5 micrometers or less) and PM10, which far exceed national and World Health Organization limits and are believed to be the main culprits behind the heavy pollution of Delhi and its surrounding areas, known as the NCR.

Reasons why Delhi NCR region faces extreme particulate pollution:

Geographical reasons:

• Location of Delhi: It lies northeast of the Thar Desert, northwest of the Central Plains and southwest of the Himalayas. When the winds arrive from the coasts, bringing with them pollutants picked up along the way, they are “trapped” just before the Himalayas.
• Cold temperatures during winter: In summer, warmer air rises higher above the surface and carries pollutants with it. In October-November, however, the air is not as warm, so pollutants are trapped and tend to concentrate in the lower levels of the atmosphere, producing smoke and haze.
• Lack of wind, especially after the monsoon ends: The average winter wind speed in the Delhi NCR region is one third of that in the summer months, which allows pollutants to concentrate in the area.
• Dust storms: According to SAFAR (System of Air Quality and Weather Forecasting And Research), 40% of the particulate pollution in Delhi on such days could come from a "multi-day dust storm" originating in the Middle East.

Anthropogenic factors:

• stubble burning: The root cause of stubble burning dates back to the 1960s-70s, when India introduced several measures as part of its green revolution to feed its growing population.
• Government policy: In an attempt to address the growing water crisis, the governments of Punjab and Haryana introduced laws that delayed the cultivation of Kharif crops and thus aggravated pollution from stubble burning.
• Manufacturing, power generation, construction and transportation: The Central Pollution Control Board (CPCB) and the National Environmental Engineering Research Institute (NEERI) have said that vehicle emissions are one of the main contributors to the increase in air pollution in Delhi.
• Minimum citizen participation: Unlike other parts of the world, there are few citizen movements to fight against pollution.
• Poor regulation: Regulation is most often seen as imposing bans rather than engaging with and persuading industry, mostly small factories, to adopt environmentally friendly measures.
• India has not recognized, in policy or law, that air pollution is a killer.

Impact

• On adults: A Lancet report found that 12.5% of India's deaths were due to air pollution.
• On children: More than 116,000 infants in India died within a month of their birth in 2019 due to air pollution, both outdoor and indoor, according to the State of Global Air 2020 report.
• On mothers: Studies find that a pregnant mother's exposure to very high levels of pollution affects the placenta and the fetus.
• On education: Hours are lost due to school closures. For example, severe air pollution in Delhi led to the closure of primary schools.
• On the economy: Closure of industries and factories, limits on construction activity, etc.

Measures taken by the government

• Graded Response Action Plan (GRAP): Pursuant to the Supreme Court's order in M.C. Mehta vs Union of India (2016) regarding air quality in the National Capital Region of Delhi, a graded response action plan has been prepared for implementation across different Air Quality Index (AQI) categories, namely moderate to poor, very poor, and severe.
• National Clean Air Programme (NCAP): It aims to reduce the concentration of coarse (PM10) and fine (PM2.5) particles in the atmosphere by at least 20% by 2024, with 2017 as the base year for comparison.
• To mitigate stubble burning: A series of short-term ex-situ and in-situ solutions have been deployed by the Union and State governments.
• In-situ solutions include Turbo Happy Seeders and bio-decomposers, while ex-situ solutions include collecting stubble for use as fuel in boilers, producing ethanol, or simply burning it alongside coal in thermal power plants.
• Other measures: Vehicle pollution control teams, public awareness campaigns, investments in mass rapid transit systems, and the phasing out of old commercial vehicles.
• Delhi's "Green War Room", leading the fight against smog, analyzes satellite data on farm fires in Punjab and Haryana to identify and act on the sources.
• Cleaner transportation: The government's recent push for electric vehicles is promising, though industry response and customer buy-in will be key.
• Better farming practices: Political will to act is needed, as poor farmers complain that they receive no financial support to properly dispose of post-harvest stubble.
• The Indian Institute of Agricultural Research came up with an inexpensive way to deal with stubble burning: spraying a chemical solution that breaks down crop residues and turns them into manure. Better coordination is needed.

Conclusion

In the face of a growing environmental and health calamity, pollution control efforts are being strengthened. But to succeed, different levels of government must harness the political will to invest more, coordinate across borders, and motivate businesses and residents to do their part.

Air pollution

Q. How is air pollution measured and tracked in India? What are the recent changes introduced in the measurement of air pollution? (15M)

In the context of the WHO air quality guidelines, consider the following statements (UPSC 2022):

1. The 24-hour average of PM2.5 should not exceed 15 ug/m3 and the annual average of PM2.5 should not exceed 5 ug/m3.
2. In a year, the highest levels of ozone pollution occur during periods of bad weather.
3. PM10 can cross the pulmonary barrier and enter the bloodstream.
4. Too much ozone in the air can trigger asthma.

Which of the statements given above are correct?

(a) 1, 3 and 4

(b) 1 and 4 only

(c) 2, 3 and 4

(d) 1 and 2 only

In accordance with the new air quality guidelines recommended by the WHO (revised after 16 years), the 24-hour average of PM2.5 should not exceed 15 ug/m3 and the annual average of PM2.5 should not exceed 5 ug/m3. The highest levels of ozone pollution occur during periods of sunny weather, not inclement weather. PM2.5 can cross the pulmonary barrier and enter the bloodstream, whereas PM10 can only lodge inside the lungs. Excess ozone in the air can cause breathing problems, trigger asthma, reduce lung function and lead to lung disease. Hence, statements 1 and 4 are correct, and the answer is (b).

Which of the following are the reasons/factors for exposure to benzene pollution? (UPSC 2020)

1. Automotive exhaust
2. Tobacco smoke
3. Wood heating
4. Use of varnished wooden furniture
5. Use of polyurethane products

Select the correct answer using the code below:

(a) 1, 2 and 3 only

(b) 2 and 4 only

(c) 1, 3 and 4 only

(d) 1, 2, 3, 4 and 5

How much will Halloween candy cost you this year? http://louthonline.com/how-much-will-halloween-candy-cost-you-this-year/ Fri, 28 Oct 2022 21:32:43 +0000

The National Retail Federation expects Americans to spend $3 billion this year on Halloween candy. So how much will handing out treats cost you this season? The cost of the 10 most popular candy brands, including assortment bags, has risen an average of 13% since 2021, with some jumping as much as 45%, according to Datasembly's Grocery Price Index. Skittles leads the pack in inflation, while Nestlé Crunch saw the smallest price increase. Datasembly's complete list:

• Skittles: 45% increase
• Starburst: 35% increase
• M&M's: 14% increase
• Snickers: 14% increase
• Twix: 13% increase
• Reese's: 13% increase
• Sour Patch Kids: 12% increase
• Kit Kat: 11% increase
• Assortment bags: 8% increase
• Sticks: 7% increase
• Nestlé Crunch: 6% increase

In addition, shrinkflation has also hit holiday candy. Shrinkflation occurs when manufacturers reduce the size of products instead of raising the price. The Washington Post reported that a bag of Hershey's Kisses dark chocolate has shrunk by a few ounces, while Cadbury's milk chocolate bars are about 10% lighter than before.

While price can determine what you give away, what trick-or-treaters are looking for depends on location. Overall, America's favorite is Reese's Peanut Butter Cups, with Skittles and M&M's second and third respectively, according to CandyStore.com. Hot Tamales may surprise some, sitting firmly at No. 5 this year. And the ever-controversial Candy Corn slipped into the top ten, at number ten. Sour Patch Kids are most sought after in New York, Swedish Fish in Georgia, Tootsie Pops and Salt Water Taffy in Tennessee, and Lemon Heads in Louisiana, while Double Bubble gum is a hit in Massachusetts. Yours truly is ashamed to admit a love of Candy Corn, which apparently means I should call Michigan and Utah home! Click here to find out what's most popular in your state.
Also, if you're struggling with last-minute costume ideas or trying to avoid looking like everyone else, click here to see which Halloween costumes are most popular in your area. (Props to Cheyenne, Wyoming, where people love to dress up as roller coasters!) Despite rising costs, many Americans still plan to celebrate Halloween, and it remains retailers' most lucrative holiday after Christmas. Including costumes, decorations and treats, consumers are expected to spend $10 billion this year, up 5% from last year, according to the National Retail Federation. What do these dynamics tell us about pricing power and investment opportunities in times of high inflation?
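The shrinkflation described above is effectively a hidden unit-price increase; a minimal sketch of the arithmetic (the package size and price are illustrative, not the actual Hershey's or Cadbury figures):

```python
# Shrinkflation as a hidden unit-price increase.
def unit_price(price, weight_oz):
    return price / weight_oz

old_unit = unit_price(2.00, 10.0)   # $/oz for a hypothetical 10 oz bar
new_unit = unit_price(2.00, 9.0)    # same shelf price, bar 10% lighter
hidden_increase = new_unit / old_unit - 1   # ~0.111, an ~11.1% effective rise
```

Note the asymmetry: a 10% cut in weight at a fixed price raises the unit price by 1/0.9 - 1, about 11.1%, not 10%.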

Pricing power standouts

Pricing power is the ability to raise prices in order to maintain or increase margins without impeding demand. Or in other words, in times of inflation, a company’s ability to pass on rising costs to customers and keep them buying. In the case of Halloween, people’s desire to indulge in the festivities is enough to overcome some substantial price increases, indicating that confectionery companies have pricing power during the holiday.

More generally, in an environment of volatile markets, supply chain issues, inflationary and recessionary pressures, sectors and companies with more pricing power generally have an advantage over those with less. “We believe that companies with pricing power have the potential to outperform the broader market in the months ahead,” said Chief Investment Office (CIO) strategist Michelle Laliberte.

To determine pricing power, the CIO focuses on companies with high and stable gross margins, but also considers five competitive forces:

1. Competitive Rivalry: An industry made up of a small number of firms with a competitive advantage will produce higher pricing power.

2. Supplier Power: In the traditional sense, supplier power relates to the ease with which a firm’s suppliers can raise their prices. Generally, if suppliers offer a unique product that is hard to find elsewhere, they have a better ability to raise prices. In a broader sense, supplier power can be linked to pressure on input costs. For example, in a tight labor market, labor suppliers have more bargaining power, and price increases may be offset by higher wages in labor-intensive industries.

3. Buyer Power: Buyer power is the reverse of supplier power. If there are only a few buyers and their purchasing power is high, a company must be careful not to lose them. This effectively reduces the firm's pricing power.

4. Threat of Substitutes: The ability to substitute one product for another is an indicator of the elasticity of demand for a product. Inelastic demand indicates that consumer demand is less sensitive to price, and products with few substitutes have relatively inelastic demand. Branding can at least partially protect against the threat of substitution: even when a product has multiple substitutes, a strong brand image or higher-quality product can evoke brand loyalty, making consumers more loyal and pricing power higher.

5. Threat of new market entrants: Companies or industries described as "natural monopolies" benefit from an "economic moat": barriers to entry, such as high start-up costs, make it extremely difficult for new competitors to enter. Companies that operate in industries with high barriers to entry tend to enjoy higher pricing power, but there are exceptions. For example, regulations may exist to prevent natural monopolies from price-gouging.

There is no universally accepted or perfect way to quantify these five forces, but they are important to consider in today’s economy, according to Laliberte.
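The substitute-threat force above turns on the price elasticity of demand. A minimal sketch of the arithmetic, using the standard definition (the quantity change below is a hypothetical illustration, not reported data):

```python
# Price elasticity of demand: % change in quantity / % change in price.
def price_elasticity(pct_change_qty, pct_change_price):
    return pct_change_qty / pct_change_price

# A 13% candy price rise that trims demand by only 2% (hypothetical) implies
# |e| ~ 0.15: demand is inelastic (|e| < 1), consistent with pricing power.
e = price_elasticity(-0.02, 0.13)
inelastic = abs(e) < 1
```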

Pricing Power Standouts is a tactical equity theme from the CIO and includes a list of 22 stocks. To view the list and learn more, please ask your advisor for the report entitled US Equity Tactical Themes: Monthly Update, published on October 11, 2022.

Main Contributor: Wendy Mock

The least skilled immigrant workers are not replacing American workers http://louthonline.com/the-least-skilled-immigrant-workers-are-not-replacing-american-workers/ Tue, 25 Oct 2022 23:23:05 +0000

Here we study the economic effects of a large-scale experiment in the United States: the natural, nationwide, firm-level randomization of restrictions on the employment of immigrants in low-skilled jobs. The United States has one primary work visa for low-skilled labor in the nonfarm economy: the H-2B visa. Employers' access to this visa is limited by a quota and awarded in part via a random lottery conducted by the federal government. This exogenous variation in immigrant employment restrictions allows unusually transparent and policy-relevant estimates of how American firms and workers adjust. After publicly committing to our hypothesis tests and predicted treatment effects in a pre-analysis plan, we collected data on winners and losers of the 2021 H-2B visa lotteries in a new survey of firms. This allows predefined tests of basic theoretical predictions about the magnitude and heterogeneity of the effects of low-skilled immigration restrictions. It also allows estimation of the "combined" immigrant-native elasticity of substitution at the firm level (Hicks 1936).

We find that exogenous permission to employ immigrants for low-skilled labor causes the marginal firm to expand its output; equivalently, exogenous restrictions on the profit-maximizing number of low-skilled immigrant employees cause the marginal firm to contract. These restrictions lead to a large and statistically significant drop in revenue and investment, and to no increase, or a decrease, in both the employment of low-skilled native workers and the rate of profit. Losing the lottery reduces a firm's employment of low-skilled immigrants by 56%. This decline causes businesses to contract, shrinking operations with elasticities of +0.164 for revenue and +1.03 for investment (statistically distinguishable from zero at conventional levels), and of +0.102 for low-skilled American employment and +0.100 for the rate of profit (statistically indistinguishable from zero at conventional levels).
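As a rough illustration (my own back-of-the-envelope extrapolation, not a calculation from the paper), the reported elasticities can be combined with the 56% employment drop in a log-linear approximation:

```python
import math

def implied_change(elasticity, pct_change_driver):
    """Log-linear extrapolation: ln(1 + %outcome) = elasticity * ln(1 + %driver)."""
    return math.expm1(elasticity * math.log1p(pct_change_driver))

d_immig = -0.56  # lottery losers' drop in low-skilled immigrant employment

# Revenue (elasticity +0.164): implied drop of roughly 13%.
print(round(implied_change(0.164, d_immig), 3))  # ≈ -0.126

# US low-skilled employment (elasticity +0.102, statistically zero): ≈ -8%,
# i.e. native employment falls alongside, rather than replacing, immigrant labor.
print(round(implied_change(0.102, d_immig), 3))  # ≈ -0.080
```

The point estimate for native employment moves in the same direction as immigrant employment, which is the sense in which the two are complements rather than substitutes.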

This is by Michael A. Clemens and Ethan G. Lewis, "The Effect of Low-Skilled Immigration Restrictions on American Firms and Workers: Evidence from a Random Lottery," NBER Working Paper No. 30589, October 2022.

The methodology is quite clever. Because lucky employers are chosen by lottery, there is no selection bias. This means that Clemens and Lewis can examine job changes for employers who won the lottery and job changes for employers who lost.

The authors put it more succinctly in their abstract:

Firms exogenously allowed to employ more immigrants significantly increase output (elasticity +0.16), with no decrease or increase in US employment (elasticity +0.10, statistically imprecise), in multiple pre-registered subsamples. The results imply very low substitutability between domestic and foreign labor in policy-relevant occupations.

In short, low-skilled Americans are not losing their jobs to low-skilled immigrants.

HT: Tyler Cowen.

]]>
How to cook vegan: the ultimate guide to egg, milk and butter substitutes http://louthonline.com/how-to-cook-vegan-the-ultimate-guide-to-egg-milk-and-butter-substitutes/ Fri, 21 Oct 2022 16:00:00 +0000 http://louthonline.com/how-to-cook-vegan-the-ultimate-guide-to-egg-milk-and-butter-substitutes/

It’s always a good time to bake cookies, cakes and anything sweet. To demystify buttery and chewy confections, we went straight to one of the sweetest candy experts around.

Fran Costigan, a virtual vegan baking queen, reminds us to use healthy, quality ingredients which, of course, contain no animal products. "Without the butter, eggs, and white sugar," Costigan says, "I know the ingredients taste fresher."

An important part of substituting vegan ingredients when baking is understanding the properties of particular ingredients and getting a feel for how everything works together.


You get there through testing, as Costigan says. Try cutting a recipe in half to test it, then make changes from there. Whether you're planning to spend hours in the kitchen or just want to whip something up quickly, VegNews has the baking substitution guide for you. Happy baking!

## What Is Vegan Baking?

Traditional baking, unlike vegan baking, relies heavily on animal products. Often, recipes for baked goods call for eggs and dairy products like butter, cream, and cow’s milk.


Vegan pastry, on the other hand, omits all animal products. Although cooking without eggs and butter may seem daunting, it’s not impossible. All it takes is getting familiar with the right substitutions, and you’ll be baking cakes, cookies, cupcakes and more in no time.

## Vegan Baking Substitutes

Next time a recipe calls for animal products, try these vegan swaps instead.

## Ban the butter

What it does: In baking, butter adds flavor and a rich, sometimes spongy texture. It also helps baked goods rise evenly and adds both density and softness.

How to replace: Butter is extremely easy to substitute in vegan baking, even when vegan butter can't be found.

If you’re cooking a recipe that contains spices or a natural flavor, such as gingerbread cookies or gingerbread, olive oil or untoasted sesame oil works well.


Unrefined coconut oil (which is solid at room temperature) can add the thickness that butter would, and canola oil works in recipes with liquid sugars (think agave) or solid mix-ins, such as peanut butter or chocolate, in cakes.

Vegan shortening works well with cookies and pie crusts. And of course, there’s margarine, which creates the buttery taste that so many cookies need.

Delicious butter recipes include:
Chewy Chocolate Vegan Brownies
Peanut Butter Chocolate Chip Vegan Pizza
Pastry

## Move over, milk

What it does: Milk adds flavor and richness and creates texture in baking.

How to replace: Milk is certainly the easiest to substitute in vegan baking, as many non-dairy milks already exist.

Whole soy milk will help create the richness of whole milk, while rice milk is lighter. Almond milk can sometimes add a subtle almond flavor, as can coconut milk, and both will add richness to a recipe.

To add vanilla flavor, try vanilla flavored non-dairy milk.

For rich recipes without milk, try:
Vegan and gluten-free banana cream cupcakes
Vegan Cinnamon Streusel Muffins

## Eggs squeezed out

What it does: Eggs add moisture and act as a binder in baking. They are also a leavening agent, helping foods rise during baking.

How to replace: Milk may be the easiest ingredient to replace, but eggs are a close second.

Ground flaxseed is a popular substitute that is also nutritious – three tablespoons of water to one tablespoon of ground flaxseed equals one egg.
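For batch baking, the flax-egg ratio above scales linearly; a tiny helper (the function name is my own, not from any recipe source) makes the arithmetic explicit:

```python
# Scale the flax-egg ratio from the text:
# 1 tbsp ground flaxseed + 3 tbsp water replaces one egg.
def flax_egg(num_eggs):
    return {
        "ground_flaxseed_tbsp": 1 * num_eggs,
        "water_tbsp": 3 * num_eggs,
    }

print(flax_egg(2))  # {'ground_flaxseed_tbsp': 2, 'water_tbsp': 6}
```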


Mashed banana and applesauce are other healthy alternatives that completely eliminate the cholesterol that eggs add to cooking.

“Baking powder, baking soda, and vinegar are all good,” Costigan says. And soy yogurt is a creative way to replace eggs and can add a rich texture to your baking, just like mashed black beans.

Here are some delicious recipes that omit the eggs:
Vegan Marzipan Challah
Chocolate Vegan Coffee Scones
Vegan zucchini cake

## Hold the honey

What it does: Honey acts as a natural sweetener. It also helps brown your baked goods, adds color, and retains moisture.

How to replace: Simply grab other viscous liquids, like maple syrup, rice syrup, or agave nectar. They add the same natural sweetness and contribute to browning effects.


Costigan recommends cooking them a bit to simmer some of the water out to create a thicker syrup.

Recipes that use these natural sweeteners include:
Vegan Boozy Holiday Profiteroles
Blueberry Banana Vegan French Toast Casserole

## Can it, cream

What it does: Cream creates a smooth and sometimes fluffy texture in baked goods. It adds richness and can impart a satiny quality.

How to replace: The richness of coconut milk can advantageously replace cream. For a homemade replacement, mix one part cashews and one part water until smooth.


There are also a variety of non-dairy creams and creamers on the market.

For creamy treats, try:
Vegan chocolate eclairs
Almond Butter Vegan Pudding Pie

Want to know some of VegNews’ favorite products for vegan baking substitutions? Continue reading!
Agave
egg substitute
Maple syrup
raw cashew nuts
rice syrup
Unrefined coconut oil
Vegan shortening

##### For more vegan cooking tips, read:
]]>
Seismic Velocity Changes in the Groningen Reservoir Associated with Remote Drilling http://louthonline.com/seismic-velocity-changes-in-the-groningen-reservoir-associated-with-remote-drilling/ Thu, 20 Oct 2022 11:46:16 +0000 http://louthonline.com/seismic-velocity-changes-in-the-groningen-reservoir-associated-with-remote-drilling/

As mentioned earlier, the temporary decrease in the travel time of the P waves to geophone 10 and the increase in the delay time of the PS waves could be explained by a temporary upward movement of the gas-water contact (GWC). Here we explore this interpretation in more detail.

### GWC elevation from seismic observations

The variation of seismic velocities due to the substitution of gas by water in a porous sandstone can be calculated with Gassmann's model of fluid substitution (refs. 25, 26). The bulk modulus of a fluid-saturated rock is related to the porosity and to the bulk moduli of the mineral matrix, the pore fluid, and the dry rock frame. The bulk modulus of the fluid increases when gas is replaced by water, which increases the effective bulk modulus of the rock and therefore the P-wave velocity. The shear modulus, on the other hand, does not change, because it depends mainly on the solid rock frame. However, due to the small increase in density, a slight decrease in S-wave velocity is expected.
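The Gassmann substitution described here can be sketched in a few lines. Note that all input values below are illustrative placeholders (quartz-like mineral modulus, generic dry-frame modulus, brine vs. gas fluid moduli), not the unknown Groningen rock properties:

```python
# Sketch of Gassmann fluid substitution. Moduli in Pa, densities in kg/m^3.
# Input values are illustrative assumptions, NOT measured Groningen properties.

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated-rock bulk modulus from dry-frame (k_dry), mineral (k_min),
    and pore-fluid (k_fl) bulk moduli and porosity phi."""
    b = 1.0 - k_dry / k_min
    return k_dry + b * b / (phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min**2)

def vp(k_sat, mu, rho):
    """P-wave velocity; the shear modulus mu is unaffected by the pore fluid."""
    return ((k_sat + 4.0 * mu / 3.0) / rho) ** 0.5

k_dry, k_min, mu, phi = 15e9, 37e9, 12e9, 0.15
k_gas = gassmann_ksat(k_dry, k_min, 0.04e9, phi)   # gas-filled pores
k_brine = gassmann_ksat(k_dry, k_min, 2.2e9, phi)  # water-filled pores

# Replacing gas with water raises K_sat and hence Vp, as the text argues.
print(vp(k_gas, mu, 2250) < vp(k_brine, mu, 2300))  # True
```

With these placeholder values Vp rises by roughly 200 m/s on substitution, qualitatively matching the sonic-log contrast discussed below.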

Quantitative estimation of the change in GWC level using the Gassmann model would require accurate values of the bulk and shear moduli of the matrix, the fluids (gas and brine), and the rock frame for the local rock. Since these values are unknown and approximations would carry large uncertainties, we took a more practical approach. We estimated the average P-wave velocity above and below the GWC from sonic logging data and found 3321 m/s and 3688 m/s, respectively (Supplementary Material, Section 5). Assuming these values, a decrease in P-wave travel time of 0.7 ms would correspond to a rise in the GWC of 23 m. We further investigated whether this rise in the GWC could also explain the increase in PS delay time (Δ(t_PS − t_P) ≈ 1.0 ms). Assuming a 23 m GWC shift, an increase in S-wave travel time of 0.3 ms, and vertical propagation with an S-wave velocity of 2000 m/s for gas-bearing sandstone (ref. 12), we find a decrease in S-wave velocity of only 52 m/s (2.6%). This decrease would be solely the effect on the shear velocity of the density increase caused by the replacement of gas with water. Although these values seem realistic, it should be noted that the uncertainties are large and that 23 m should only be interpreted as an indication of the GWC elevation inferred from our measurements.
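The 23 m figure follows directly from the two sonic-log velocities quoted above: raising the contact by a height h swaps a gas-saturated (slower) leg of the vertical ray path for a water-saturated (faster) one.

```python
# Back-of-envelope check of the quoted GWC rise: a rise h replaces a
# gas-leg travelled at v_gas with a water-leg travelled at v_water,
# shortening the one-way travel time by dt = h/v_gas - h/v_water.
def gwc_rise(dt, v_gas, v_water):
    return dt / (1.0 / v_gas - 1.0 / v_water)

h = gwc_rise(0.7e-3, 3321.0, 3688.0)  # dt = 0.7 ms, velocities from sonic logs
print(round(h, 1))  # ≈ 23.4 m, consistent with the ~23 m quoted
```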

The other observation is the rapid decrease in the noise level, and its equally rapid return to normal, seen at geophone 10 compared with the other geophones (Fig. 4). Such rapid changes are readily explained by changes in the level of the GWC, although it is unclear how this would reduce the noise level.

### Relationship to drilling operations at Borehole ZRP-3

If a temporary elevation of the GWC can explain the seismic observations, the question arises as to what caused the elevation of the GWC. Gas production data in the area has been verified, but does not show any correlation with our data. Since the timing of the anomaly appeared to be correlated with the drilling of the ZRP-3 well at a distance of 4.5 km, we reviewed the detailed drilling report provided by NAM.

Drilling started on May 23 (2015) and the reservoir was reached by drilling in the Ten Boer clay on July 13 (Fig. 3a). Downhole drilling mud losses occurred on July 18 and the early hours of July 19. Deeper drilling took place for limited periods on single days between July 23 and August 21 when the maximum depth of 3284 m was reached. GWC depth was reached on July 31 and Carboniferous Shale was drilled on August 11. The cementing of the borehole took place on August 28 and 29 and the borehole was left on August 30 after the cement hardened.

The first conclusion is that there is no correlation between our observations and the actual drilling intervals. First, the drilling and coring periods were scattered over time, while our observations show a trend of generally decreasing travel times over about a month (Fig. 3a). Second, borehole noise would affect travel times between all geophone pairs, but this is not observed (Supplementary Material Fig. S3). Thus, drilling noise cannot explain the observations. We also considered the downhole mud losses that occurred while drilling in the Ten Boer clay. However, these losses began 30 hours after the anomalous observations started.

A more likely cause is pore-pressure variation caused by the drilling. NAM provided us with downhole static pressure data (BHP_s), calculated from the borehole depth h, the drilling-mud density ρ_m, and the gravitational acceleration g: BHP_s = ρ_m g h. These data are shown in Fig. 3a. Note that BHP_s represents only a portion of the total bottom-hole pressure (BHP), because ram (dynamic) pressure effects are not included. Rapid decreases in BHP_s, for example between July 19 and 23 in connection with the mud losses, will have been dynamically compensated by the circulation of the drilling fluid to stabilize the BHP. From July 23 to August 19, as drilling depths increased from 2919 to 3267 m, there is a gradual increase in BHP_s from 36 to 39 MPa, as indicated by the dotted blue box in Fig. 3a. This gradual increase in BHP is anti-correlated with the decrease in P-wave travel time from geophone 8 to 10 and correlated with the increase in PS delay time at geophone 10 (Fig. 3a,b).

Assuming that our anomalous observations at SDM-1 are related to the downhole pressure (BHP) at ZRP-3, they are most likely linked through changes in pore pressure. A rise of the GWC of ~23 m would correspond to an increase in pore pressure in the aquifer part of the sandstone of ~0.23 MPa (ΔP = ρ_w g Δh). By relating the start and end times of the drilling operations in the reservoir to the start and end times of our anomalous observations, we calculated the time it took the pressure front to propagate from ZRP-3 to SDM-1. Drilling in the Ten Boer clay took place on July 13, between 07:45 and 17:00, while the anomalous seismic observations began on July 17 at ~00:00 (Fig. 4a). This gives a delay of 3 days and 7–16 hours. A similar calculation can be made for the end of the anomalous period. The hardening of the ZRP-3 cement took place on August 30 (00:00–07:30). After the cement hardened, the well was sealed and there was no further influence from drilling operations. Combining this with the end of the anomaly at SDM-1 on September 2 at 19:00 (Fig. 4b) gives a delay of 3 days and 11.5–19 hours.
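The quoted pressure rise is a simple hydrostatic conversion of the 23 m contact rise; the brine density below is taken as a round 1000 kg/m³ (an assumption consistent with the ~0.23 MPa figure, though reservoir brine is somewhat denser):

```python
# Hydrostatic check: dP = rho_w * g * dh for a ~23 m rise of the GWC.
rho_w, g, dh = 1000.0, 9.81, 23.0  # kg/m^3, m/s^2, m (density assumed)
dP_MPa = rho_w * g * dh / 1e6
print(round(dP_MPa, 2))  # ≈ 0.23 MPa
```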

### Diffusion of pore pressure

From our seismic observations and their correlation with the downhole pressure at ZRP-3, we infer that variations in pore pressure may have caused the changes in GWC level at SDM-1. Next, it should be verified that pore-pressure diffusion can explain the delay between the drilling of the reservoir at ZRP-3 and the GWC response at SDM-1, 4.5 km away.

In the case of isotropic, spherical diffusion, the hydraulic diffusivity D associated with pore-pressure diffusion in a fluid-bearing porous medium can be estimated from the time t it takes the pressure front to reach a distance r (ref. 27):

r = √(4πDt).  (1)

The pore-pressure diffusivity is estimated from the propagation time of the pressure front, given the time windows of 3 days and 7–16 h (onset) and 3 days and 11.5–19 h (end). The largest delay (3 days and 19 h) and the smallest (3 days and 7 h) give diffusivities of 4.9 m²/s and 5.7 m²/s, respectively.
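Inverting Eq. (1) for D with the 4.5 km ZRP-3 to SDM-1 distance and the two bounding delays reproduces the quoted range:

```python
import math

# D = r^2 / (4*pi*t), from r = sqrt(4*pi*D*t) in Eq. (1).
def diffusivity(r, t):
    return r**2 / (4.0 * math.pi * t)

r = 4500.0                        # m, distance ZRP-3 to SDM-1
t_long = (3 * 24 + 19) * 3600.0   # 3 days 19 h, in seconds (largest delay)
t_short = (3 * 24 + 7) * 3600.0   # 3 days 7 h (smallest delay)

print(round(diffusivity(r, t_long), 1))   # ≈ 4.9 m^2/s
print(round(diffusivity(r, t_short), 1))  # ≈ 5.7 m^2/s
```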

An independent estimate of the hydraulic diffusivity D can be calculated from material properties, including the average porosity of 0.15 (ref. 28) and permeability of 120 mD (ref. 29), with more details provided in the "Method" section. We find a pore-pressure diffusivity of 3.9 m²/s, which is similar to, although somewhat smaller than, the previously estimated range of 4.9–5.7 m²/s. For this diffusivity range, permeabilities of 151 to 176 mD are required, somewhat higher than our adopted value of 120 mD but within the wide range of 1 to 1000 mD measured for the Groningen gas reservoir (ref. 30). Thus, it is concluded that diffusion of pore pressure in the aquifer part of the reservoir can explain the delay between the drilling-induced overpressure at ZRP-3 and the change of the GWC at SDM-1.

The Groningen reservoir is heavily faulted, and faults can act either as barriers or as effective conduits for pore pressure, depending on direction: permeability is generally high in the damage zone parallel to a fault and low across it (ref. 31). The NAM fault map for the top of the reservoir (Fig. 5a) shows a fault with an offset of approximately 150 m midway between SDM-1 and ZRP-3, separating two compartments of the reservoir on either side of it (Fig. 5b). This fault is likely to hinder direct diffusion of pore pressure through the gas-bearing parts between the two compartments. On the other hand, the ~20 m change in GWC level at 4.5 km from the drilling site and the high diffusivity (~5 m²/s) suggest a high-permeability conduit between the two locations. The NAM fault map does not show a connecting fault, although, speculatively, there are two ENE-WSW-trending fault segments that could be connected at the bottom of the reservoir (Fig. 5a).

It is important to note that SDM-1 is a well that is open at top and bottom and perforated at reservoir depths between 2965 and 2995 m. A column of high-density brine inside the well prevents reservoir gas from flowing in through the perforations. Since the well is an open system, it is sensitive to variations in hydrostatic pressure in the reservoir. Our speculative hypothesis is that the pressure front, caused by the overpressure at the distant borehole and propagated through the aquifer part of the reservoir, reached SDM-1 and pushed the brine column upward, raising the water level in the well. Following the level change in the perforated well, the GWC in its immediate vicinity was also raised, which was detected by the seismic data.

While we realize that parts of our interpretation are highly speculative, we have been unable to find another plausible explanation. Nevertheless, it seems clear that the observations are related to the distant drilling, an unexpected effect that may be important for other drilling activities.

]]>
How mean are politicians? This book says “very” http://louthonline.com/how-mean-are-politicians-this-book-says-very/ Mon, 17 Oct 2022 08:00:00 +0000 http://louthonline.com/how-mean-are-politicians-this-book-says-very/

The second volume in Bryan Caplan's series of collected EconLog blog posts asks How Evil Are Politicians? Libertarian Caplan's answer: very, although some people might be more inclined to use words like "naive" or "irresponsible." Caplan, who holds himself and others to very high moral standards, is deliberate about his use of the word "evil" for a volume subtitled Essays on Demagoguery and bearing a cover designed with George Orwell's 1984 in mind. This is obviously intentional; Orwell's influence is apparent throughout the book, and Caplan is explicit on p. 56: "George Orwell had a huge influence on me."

But aren't politicians just naive, or perhaps irresponsible? Their naivety and irresponsibility do not excuse them in Caplan's eyes. He sets a very high epistemic bar for the aspiring philosopher-king who wishes to order others around, even for their own good. He is right to do so. I quote Adam Smith:

"The statesman, who should attempt to direct private people in what manner they ought to employ their capitals, would not only load himself with a most unnecessary attention, but assume an authority which could safely be trusted, not only to no single person, but to no council or senate whatever, and which would nowhere be so dangerous as in the hands of a man who had folly and presumption enough to fancy himself fit to exercise it." (The Wealth of Nations, IV.2.10)

Caplan explains how “statesmen” (and stateswomen) time and time again fail to prove themselves worthy of the authority and power they crave. As Caplan says, “If you are able to pass or enforce laws, lives and liberty are in your hands. Common decency requires you to act with extreme moral concern at all times, always aware of the possibility that you are violating the rights of morally innocent people” (p. 8). Moral and epistemic standards are higher if you believe you can direct the lives of others.

Politicians can be sincere, but sincerity is no substitute for understanding. As libertarian author Sheldon Richman explained, making economic policy while totally ignorant of basic economics is the intellectual equivalent of drunk driving. Doing it once is irresponsible. Driving drunk repeatedly and unrepentantly through school zones as the kids are let out might earn the word evil.

Like Labor Econ Versus the World, How Evil Are Politicians? is divided into four parts. Part I explains how "evil rules the world." Part II introduces us to "A Litany of Evil." Part III lays out Caplan's "pragmatic pacifism," which I'd really like to see him explore in a serious, scholarly, book-length treatment. Finally, Part IV asks, "How good is freedom?"

"Demagoguery," says Caplan, "is the politics of social desirability bias" (p. 18, emphasis in original). He then describes the heart of social desirability bias: certain types of claims sound right or wrong regardless of the facts. Social desirability bias is embedded in the names of many regulatory bodies and pieces of legislation. Who could be against Equal Employment Opportunity? Or Fair Housing? Or Inflation Reduction? Social desirability bias builds philosophical and political systems on the sandy ground of wishful thinking. People's opinions on things like the minimum wage, for example, are shaped by silent mental substitutions. People who think about the minimum wage may not understand the elasticity of labor demand, and so "they mentally substitute easier questions like, 'Would I be happy if employers gave unskilled workers a raise?'" (p. 36). Mental substitution then makes it easy to demonize minimum-wage skeptics by inferring that they would be sad if low-skilled workers got raises.

I see this quite regularly in public discussions of "sweatshop" labor. Too often, anti-sweatshop crusaders seem to think that sweatshop "advocates" believe sweatshop labor is a cosmic good for the people who do it, rather than hard work whose terms could be sweetened by higher wages and better working conditions. The real argument is that sweatshops are often the best of a set of really bad alternatives, so shutting them down actually makes the situation worse for the workers themselves. This, however, does not lend itself to effective grandstanding.

Caplan's argument for "pragmatic pacifism" applies his analysis of demagoguery to the incitement of war. Many arguments for war seem to go no further than wishful thinking that the world would be a better place if terrorism, racism, and other horrors simply disappeared overnight. I can't imagine anyone seriously disagreeing. If I could get rid of terrorism with a snap of my fingers, I would. Of course, that's not how war works. Caplan explains that the costs are immediate and horrific while the benefits come much later and are extremely uncertain. Too often, future benefits are simply wished for, and belligerents do not always foresee what will happen the day after their victory. Caplan makes this point in reference to one of the bloodthirsty characters in Game of Thrones:

"He has no master plan to bring great good out of great evil. Instead, he has a master plan to do great evil, driven by vague wishes to do great good. Proverbially, however, if you fail to plan, you plan to fail."

The same criticism applies, he argues, to American leaders who were unsure of what would happen after the overthrow of Saddam Hussein or Muammar Gaddafi. How Evil Are Politicians? does what Caplan has done so well over the years: it challenges our wishful thinking and pleasant fictions with calm, cold analysis and an insistence on comparing what we hope for to what we can reasonably expect. And, therefore, it offers wise advice: stop listening to demagogues.

]]>
Energy Shocks Can Have Perverse Consequences http://louthonline.com/energy-shocks-can-have-perverse-consequences/ Thu, 13 Oct 2022 10:24:53 +0000 http://louthonline.com/energy-shocks-can-have-perverse-consequences/

The now-dismantled dth-nul-energihus on the outskirts of Copenhagen offers a vision of a future that never materialized. Built during the 1973 oil shock by the Technical University of Denmark, the chunky white building, consisting of two living spaces separated by a glass atrium and topped with a spine of solar panels, was one of the first attempts to create a zero-energy house.

The nul-energihus didn't quite reach "zero energy," but its vital stats were impressive nonetheless. It needed only 2,300 kilowatt-hours of energy per year, roughly the same as six modern refrigerators. Its copious insulation and solar heating system kept it warm even in freezing Danish winters. When a family moved in, things went downhill a bit, notes Marc Ó Riain, professor of architecture at Munster Technological University. Hair clogged the filtration system, which recycled heat from the sewage, and the occupants had the annoying habit of leaving the windows open.

Yet these are problems that could have been overcome. The house was almost ready for prime time. In the years since, scientists have shown that well-targeted research and development expenditures can quickly improve quality and reduce costs (see, for example, recent improvements in electric cars and solar panels). So why couldn’t a solarpunk future of clean energy abundance happen in the 1970s? And as the world faces a new energy shock, what lessons can we learn from its failure?

Economists believe that technological progress is the ultimate engine of growth. The key question is what determines the direction of that progress. In 1932, John Hicks, an economist, launched the debate on "directed technical change" when he hypothesized in his book "The Theory of Wages" that an increase in the price of a certain factor of production (labor, in his example) would stimulate innovation to lower its cost. In the century before his book was published, wages had risen steadily, which meant that employers had an incentive to invest in labor-saving technologies rather than capital-saving ones. By this logic, a surge in the price of fossil fuels should help accelerate decarbonization.

Such green growth, however, is not inevitable. Daron Acemoglu of the Massachusetts Institute of Technology pointed out that research spending can be directed either towards clean substitutes (like solar power) or towards complements to dirty technologies (like more efficient motors). For a company, the choice of where the money goes depends on the sometimes competing forces of price and market size. An oil shock, which increases the price of fuel, makes green technologies such as solar more attractive. But the extremely widespread use of hydrocarbons can make investments in fossil fuel efficiency, known as gray technologies, more profitable.

That's pretty much what happened in the 1970s. Although some of the money was spent on projects like Denmark's zero-energy house and on the embryonic renewable-energy market, much more was devoted to gray technologies. Research by Valerie Ramey of the University of California, San Diego and Daniel Vine of the Federal Reserve shows that the main way historical oil shocks affected the US economy was by encouraging consumers to buy more fuel-efficient vehicles. The fuel economy of a typical American car went from 13 mpg in 1975 to 20 mpg in 1980.

Rather than pocketing the savings offered by more fuel-efficient cars, Americans instead bought even bigger ones and in greater numbers. So the long-term impact of the oil shock was not to kill the country’s car culture, but to embed the combustion engine even deeper into American life. By the mid-1980s, oil consumption was higher than a decade earlier, even though many of the nation’s power plants had switched to natural gas.

Environmental economists call this phenomenon, where fuel-saving measures perversely increase demand, the “rebound effect”. Something similar happened in Danish housing. Better insulation improved its energy efficiency; as a result, homes became larger and their owners more accustomed to higher temperatures. It has become common, for example, to wear t-shirts indoors during the winter. According to official statistics, the total energy consumption of dwellings has remained unchanged over the past three decades.
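The rebound logic can be made concrete with a toy calculation. The demand-elasticity values below are hypothetical, chosen only to contrast partial rebound with outright "backfire"; the mpg figures are the ones quoted above:

```python
def fuel_use(mpg, miles):
    return miles / mpg

# Miles driven respond to the cost per mile with (hypothetical) elasticity eps;
# returns the ratio of new to old total fuel use. eps = -1 exactly offsets
# the efficiency gain; |eps| > 1 means efficiency backfires.
def rebound(mpg_old, mpg_new, miles_old, eps):
    cost_ratio = mpg_old / mpg_new          # cost per mile falls by this factor
    miles_new = miles_old * cost_ratio**eps
    return fuel_use(mpg_new, miles_new) / fuel_use(mpg_old, miles_old)

print(round(rebound(13, 20, 10_000, -0.4), 2))  # partial rebound: fuel use still falls
print(round(rebound(13, 20, 10_000, -1.2), 2))  # "backfire": fuel use rises
```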

Acemoglu argues that there is a “path dependency” in technological progress. Energy efficiency can make it more difficult for other technologies to compete. A well-insulated home with a state-of-the-art gas boiler uses less fuel. But that makes the initial investment of an electric heat pump less attractive. If European industry manages to maintain its production this winter while using less gas, it could, in the future, have less incentive to switch to green methods.

Differences from previous energy shocks give cause for optimism. Economic modelers refer to the “elasticity of substitution” as the critical measure of whether expensive fossil fuels accelerate the adoption of green or gray technologies. Encouragingly, this elasticity has increased since the 1970s. Today, rising prices are expected to incentivize more shifts away from fossil fuels than in the past, thanks to the greater availability and lower cost of green alternatives.
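The role of the elasticity of substitution can be illustrated with a standard CES demand sketch (the functional form and share parameter are textbook assumptions, not from the article): when the fossil price rises, spending shifts toward the green input only if the elasticity sigma exceeds one.

```python
# Expenditure share of the fossil input in a CES aggregate of fossil and
# green energy, with substitution elasticity sigma and share parameter a.
# Illustrative sketch; a = 0.5 is an assumed symmetric baseline.
def fossil_expenditure_share(p_f, p_g, a=0.5, sigma=1.0):
    if sigma == 1.0:  # Cobb-Douglas limit: shares are constant
        return a
    ratio = (a / (1 - a))**sigma * (p_f / p_g)**(1 - sigma)  # fossil/green spending
    return ratio / (1 + ratio)

# Double the fossil price. With sigma > 1, spending shifts toward green...
print(round(fossil_expenditure_share(2.0, 1.0, sigma=1.5), 3))  # ≈ 0.414, below 0.5
# ...with sigma < 1 (a 1970s-style "gray" world), the fossil share rises.
print(round(fossil_expenditure_share(2.0, 1.0, sigma=0.5), 3))  # ≈ 0.586, above 0.5
```

A rising sigma since the 1970s is thus exactly the condition under which an energy shock pushes spending toward green rather than gray technologies.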

green thumbs

In addition, carbon prices put the government's thumb on the scales. The cost of a permit in the EU's cap-and-trade system is only expected to rise in the future as the cap on emissions falls, meaning companies have an incentive to get ahead of it. With any luck, this will limit the rebound effect in the years to come. But America is heading down a different path. Subsidizing clean tech rather than taxing dirty tech (the strategy adopted by President Joe Biden's recent Inflation Reduction Act) does not do as much to replace fossil fuels. A family can, for example, buy a subsidized battery-powered vehicle, but only in addition to a fossil-fuel vehicle, which it can continue to drive without penalty. Policy design matters if a carbon-free world is to become more than just another future that never happened.

Learn more from Free Exchange, our column on economics:
Why Chinese policymakers are relaxed about the yuan's fall (October 6)
Economists now accept that exchange-rate intervention can work (September 29)
Chinese leaders seem resigned to a slowing economy (September 22)

From The Economist, published under licence. Original content can be found at https://www.economist.com/finance-and-economics/2022/10/13/energy-shocks-can-have-perverse-consequences

A look at Michael E. Porter’s competitive edge if written for 2023 and beyond http://louthonline.com/a-look-at-michael-e-porters-competitive-edge-if-written-for-2023-and-beyond/ Tue, 11 Oct 2022 23:15:42 +0000 http://louthonline.com/a-look-at-michael-e-porters-competitive-edge-if-written-for-2023-and-beyond/

Michael Porter is the father of the MBA and of strategy design. I had the good fortune to work as a group partner at his firm Monitor Group (now part of Deloitte's consulting business), and, more importantly, much of what he set out in his book "Competitive Advantage: Creating and Sustaining Superior Performance" lies at the heart of what the tens of millions of people around the world who hold an MBA were taught. It is even at the heart of how nations can create and maintain competitive advantage.

His ideas are Newtonian in their importance and have stood the test of recessions, wars, Web 1.0, Web 2.0 and perhaps Web 3.0. His simple pentagon of competitive pressures (supplier power, buyer power, competitive rivalry, threat of substitution and threat of new entrants) is the simplest trade-off exercise you can run on a whiteboard.

However, we must ask whether they are the right way to think about the underlying needs of strategic trade-offs in a digital world where the speed and dimensions of change are frenetic, sometimes irrational, and can produce explosive upsides (Zoom or Peloton during Covid) or calamitous downsides (the Twitter and Netflix controversies over actual follower and subscriber counts).

We live in extraordinary times that must challenge the rhythmic simplicity of trade-offs that Porter’s Five Competitive Forces entail.

We know how Amazon has redesigned almost everything, how Google opened the door to near-instant insights and how Tesla rewrote the rules of the auto industry. But it's not just about doing it once, from selling books online to streaming EPL and NFL video on Amazon Prime. Companies that can jump from one category to another, over and over again, and succeed will be the winners.

We also know that persistent supply chain issues, inflation, labor shortages and a rapid reversal of unfettered global trade will be more pronounced in 2023 than in 2022. We cannot think in the world of 2023 as we did in 2022 and expect different results.

In The Digital Helix (a Wall Street Journal best-selling strategy book), one DNA metric was an 87% predictor of extraordinary economic returns: companies that embrace the idea of living in a constantly changing world become very capable at it, and their results (changes in OPEX, reductions in CAPEX, margin growth, revenue growth and capacity for innovation) are three times better than those of companies that score even 15% lower on a constant-change index. Just spend a second chewing on that.

Now is the time to update that simple Porter pentagon and look at five new engines of competitive management, set out below. One idea: the ability to manage constant change at top-performing levels is the primary indicator of economic success in turbulent times. Five forces underlie this ability to manage constant change, which I call SHAPS.

Note that in machine learning, SHAP is a mathematical method for explaining the predictions of machine-learning models: it decomposes the output of a predictive algorithm into the contributions of its individual inputs.

It stands for SHapley Additive exPlanations (Lundberg and Lee, 2017). These five forces dive deep into the DNA of successful digital businesses as they adapt and perform over time, something we have known to be empirically true since 2017.
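As a minimal, self-contained sketch of the Shapley idea behind SHAP (this hand-rolls exact Shapley values for a hypothetical two-feature linear model rather than using the real `shap` library; the model weights, baseline and instance below are invented for illustration):

```python
from itertools import permutations

# Hypothetical linear model (weights 3 and 2, intercept 5) standing in
# for any predictive model whose output we want to explain.
def model(x1, x2):
    return 3.0 * x1 + 2.0 * x2 + 5.0

baseline = {"x1": 1.0, "x2": 1.0}   # "average" inputs used when a feature is absent
instance = {"x1": 4.0, "x2": 2.0}   # the prediction we want to explain

def evaluate(present):
    """Run the model, substituting baseline values for absent features."""
    vals = {f: (instance[f] if f in present else baseline[f]) for f in baseline}
    return model(vals["x1"], vals["x2"])

def shapley_values():
    """Exact Shapley values: each feature's marginal contribution,
    averaged over every possible ordering of the features."""
    features = list(instance)
    perms = list(permutations(features))
    phi = {f: 0.0 for f in features}
    for order in perms:
        present = set()
        for f in order:
            before = evaluate(present)
            present.add(f)
            phi[f] += evaluate(present) - before
    return {f: v / len(perms) for f, v in phi.items()}

phi = shapley_values()
# For a linear model, phi_i = w_i * (x_i - baseline_i), so
# phi["x1"] = 3 * (4 - 1) = 9 and phi["x2"] = 2 * (2 - 1) = 2,
# and the attributions sum to model(instance) - model(baseline) = 21 - 10 = 11.
```

The exact computation is exponential in the number of features; SHAP's contribution (Lundberg and Lee, 2017) is a set of approximations that make this attribution tractable for real models.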

Force One:

Signal management boosts pattern recognition

Tell me something you can't find online. Information is all around us (some true, some false), and seeing through this noise to find signals is vital. Developing models to evaluate it (read more about this in The Digital Helix, in the Themes and Streams DNA) is going to be an increasingly real strategic differentiator. Should Peloton have tracked return-to-office measures and Covid infection rates rather than pumping out more bikes? What signals did it miss? Should Zoom have found ways to measure its users' appetite for more or different services during the transition from Covid as a crisis to Covid as a way of life? Is Facebook good or bad at signal management? Signal management really matters if you are open to finding new sources or new ways to look at information. Pattern recognition is often considered a sixth sense. It isn't, once you learn to chase signals, aggregate them and draw patterns from them.

Managing signals at all levels of the organization, from social media to influencer feedback in large and complex purchases, is the fundamental strength. It’s debatable whether you can stop a smarter competitor if they handle signals better than you.

Force Two:

Agile decision making is iterative, not final

There is no single right move. It's not just supplier power or customer power, as in the old model. Imagine a world where the curve of time is long and moderately predictable, like a gentle wave. Now imagine a world of short waves, occasional tsunamis and tidal bores, where sudden Bermuda Triangles can appear. Each of these requires different kinds of decisions. The ability to be agile in how you make decisions is key. It's not all or nothing, but fast, slow and change of direction. It is 100% impossible to predict the future, but it shouldn't be impossible to navigate it.

Iterative decision-making should be guided by the idea of "in-game decision-making and choice", not by a definitive path with a binary set of choices. For force two to work, force one, signal management, is key.

Force Three:

Elasticity of customer permissions matters

Brands are struggling to maintain their historic strengths and dominance when customers have so many choices. Business customers and consumers face more ambiguity about their choices than ever before. They are more open than ever to looking for alternatives: not just other suppliers, but new ways to solve problems that are unsolved today or soon may be. How open your customers are to new relationships with you is what I call the elasticity of permission.

Knowing what it is will be key to knowing how you work on your opportunities for new services, products, or even conversations. If you don’t constantly talk with customers about their challenges and opportunities, someone else will. Signal management, agile decision-making, and recognizing that the ever-changing world surrounds your customers also changes the way they think about what’s possible for them, too.

The elasticity of customer permission is a life force that you must apply. If you don't, others will open that window of opportunity with your customers.

Force Four:

Supply no longer means a chain

We get products delivered right to our door in minutes, even $7 Taco Bell orders in my zip code. Instant is going to be the new imperative, and that means supply chains have to be compressed to the point that they hardly exist anymore. The last two years have illustrated, and the next few will keep illustrating, how painful chains are, so they must be broken.

This idea is as much about people as it is about delivering products and services. It is one of the main reasons why software-centric companies that stay in constant connection with their product and their customers in a virtuous circle (like a three-circle Venn diagram) can scale potential economic returns almost at will, based on force one, signal management, force two, agile decision-making, and force three, the elasticity of customer permission. Ideas like intelligent systems, and big concepts like machine economics and edge computing, revolve around constantly connected learning systems that adapt to shrink the elements of the supply chain to near zero.

Force Five: