Antivirus 2009 benchmark

Help with your Windows PC Issues

2014.02.11 06:01 monkeychef Help with your Windows PC Issues

From blue screens to systems that won't boot, we can help you with every issue that you may come across. We specialize in PC's and the Windows operating system. Please visit a more specific subreddit if you are looking for help with your Apple computer or a specific piece of software.
[link]


2024.05.16 13:14 OTH14 Arsenal - PGMOL Analysis and Incident Tracker

Fellow Gunners
Since the start of the 2022/23 Premier League season I have been building a simple database tracking refereeing incidents (goals, penalties, bookings) relating specifically to Arsenal, to see whether any useful insights can be gleaned from this developing dataset. There will inevitably be inclusions and exclusions that provoke debate, as there are subjective elements; however, keeping a record allows a baseline to develop against which officiating performances can be benchmarked.
Over time it would be great for other fanbases/clubs to follow suit, to conclusively highlight the issues with officiating, rather than these incidents being routinely swept under the rug with vague apologies or anecdotal, non-directional explanations. Consider Wolves this season or Brighton last season, who both suffered several woeful decisions, or even Manchester United's disallowed goal against Brighton this season, which had similarities to the Anthony Gordon goal awarded against us.
To date, as we approach the end of season 2, there are multiple instances of inconsistency. The data is beginning to show patterns for individual officials, e.g. Chris Kavanagh, who over the past 18 months awarded two contentious penalties against Arsenal (Leeds United, 16 October 2022, and Chelsea, 21 October 2023) yet dismissed similar penalty claims (with VAR) for Arsenal (Bournemouth, 4 March 2023, and against Luton, 5 December 2023).
Every goal counts, and officiating mistakes unfortunately do not tend to even out, given the inherent fragility of game state within football. Although I am no fan of Manchester United, in the 2009/10 season Chelsea's Didier Drogba scored an offside (winning) goal against them which was pivotal in deciding the title race.
The results are striking: over a 5-year period, the outcomes for the team that scores first are as follows:
Win 65%, Draw 16%, Loss 12%, and No Goals (goalless matches) 7%.
Game state is affected by goals and officiating decisions, and ultimately the longer the officiating decisions are left unaccounted for the worse the problem will get.
Given all other metrics are tracked on the field, player biomarker data, expected goals/assists, progressive passes, packing rating et al, and how the inconsistent application of officiating can impact a game's (and season) outcome, is now not the time to find ways to quantify (and hopefully mitigate) the impact?
In March, Kai Havertz was deemed to have earned a second yellow for simulation against Brentford, which raises several serious questions about the efficacy of the review based on ensuing and past incidents, covered excellently in Arseblog and again more recently today. Therefore, rather than decry the status quo, it makes sense to document and provide quantified accountability to improve the game we all love, and to dispel biased narratives with insight. Removing VAR would be a backwards step; actually learning from mistakes rather than ignoring them is the way to address the inconsistency.
LINK to interactive dashboard.
TLDR: Officiating decisions need to be tracked across the Premier League to drive out inconsistencies and false narratives and to improve the game. VAR in the Premier League is not the issue; the application of the technology by officials is. Since the start of the 2022/23 season, across 75 matches and 131 incidents, 33% of decisions have gone in Arsenal's favour (see dashboard link).
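For anyone wanting to build a similar tracker, here is a minimal sketch (Python/pandas, with a hypothetical CSV layout and column names rather than the actual dashboard's schema) of how an incident log like this can be aggregated into the overall decision share and a per-official breakdown:
```python
import pandas as pd

# Hypothetical incident log: one row per refereeing incident involving Arsenal.
# Assumed columns: date, official, incident_type, in_arsenal_favour (boolean).
incidents = pd.read_csv("arsenal_incidents.csv")

total = len(incidents)
share_for = incidents["in_arsenal_favour"].mean() * 100
print(f"{total} incidents, {share_for:.0f}% in Arsenal's favour")

# Per-official breakdown, to surface patterns like the Kavanagh examples above.
by_official = (incidents.groupby("official")["in_arsenal_favour"]
               .agg(incidents="count", favourable_share="mean")
               .sort_values("favourable_share"))
print(by_official)
```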
submitted by OTH14 to Gunners [link] [comments]


2024.05.16 00:29 Cak3orDe4th Just upgraded PC and experiencing micro-stutters while gaming. Need advice.

Recently upgraded my build to the following:
Mobo: ASUS ROG MAXIMUS Z790 APEX ENCORE
CPU: Intel i9-14900ks
RAM: G.SKILL Trident Z5 RGB Series 48GB (2 x 24GB) 288-Pin PC RAM DDR5 8200 (PC5 65600)
GPU: GIGABYTE Gaming GeForce RTX 4080 16GB GDDR6X PCI Express 4.0 x16 ATX Video Card (used in my last build)
PSU: Thermaltake Toughpower GF3 1650W
Cooler: NZXT 360 Elite
Case: Hyte Y60
Fans: Lian Li SL120 (3 on the top-mounted radiator as exhaust, 3 on the side as intake, 1 on the back as exhaust), and 3 120mm NZXT fans on the bottom as intake.
Disks:
Primary - Samsung 970 EVO Plus (installed in the M.2_2 slot because using M.2_1 dropped PCIe_1 below x16).
Others - Samsung 860 EVO and Seagate FireCuda 3.5" SSHD (ST2000SX002-2DV164), both connected via SATA.
OS: Windows 11
Micro-stutters start in game after I've been gaming for a while or after the computer has sat idle for a while. A reboot fixes it temporarily, and sometimes the stutters go away on their own even without a reboot.
Nothing is overheating as I monitor the temps of everything while gaming:
CPU = 68°C (sometimes lower), GPU = 60°C, RAM = 38°C, 970 EVO = 46°C (the only drive used for gaming).
I don't know what to do to fix this issue. I've disabled Game Bar completely using YouTube guides and removed temp files. I've set the CPU to Intel's stock/recommended limits in the BIOS, updated the BIOS, and enabled XMP profile 1 (8200) for the RAM.
I've installed Samsung's NVMe drivers and tested the 970 EVO Plus with Samsung's software, which came back fine. I ran the performance benchmark and noticed the sequential write speed was slower than the read speed by about 2,000 MB/s on one run and about 1,500 MB/s on another. The most recent run shows read 3,468 MB/s and write 3,062 MB/s. Samsung Magician shows the drive as healthy.
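One additional low-effort check (a hedged sketch, not a guaranteed fix): see whether Windows is logging storage, WHEA hardware, or GPU driver errors around the times the stutters appear. The provider-name filter below is an assumption about which event sources are worth looking at; adjust it as needed.
```python
import subprocess

# Sketch: list recent System-log errors/warnings from storage, WHEA (hardware),
# and NVIDIA driver providers. Assumes Windows with PowerShell available.
ps_cmd = (
    "Get-WinEvent -FilterHashtable @{LogName='System'; Level=1,2,3} -MaxEvents 500 | "
    "Where-Object { $_.ProviderName -match 'disk|stornvme|storahci|WHEA|nvlddmkm' } | "
    "Select-Object TimeCreated, ProviderName, Id, Message | Format-List"
)
result = subprocess.run(["powershell", "-NoProfile", "-Command", ps_cmd],
                        capture_output=True, text=True)
print(result.stdout.strip() or "No matching error/warning events found.")
```
If stutter onset lines up with disk or WHEA events, that points back at the drive or hardware; if nothing correlates, software (overlays, antivirus scans) becomes more likely.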
I'm out of ideas here because the system is stable other than the micro stutters that show up. Is my EVO 970 going bad? Is there something in Windows 11 causing this? Is it antivirus software (BitDefender)?
I need advice on this...it's driving me insane.
submitted by Cak3orDe4th to buildapc [link] [comments]


2024.05.15 14:48 PunnyHeals Misrepresentation of house price data by Realtor.ca

Misrepresentation of house price data by Realtor.ca
TL;DR:
  1. Realtor.ca claims the average sale price of a house in Fredericton, NB is $288,300. My own calculations point to the average being approximately $543,878.
  2. Realtor.ca most likely calculates its average house price using the combined average for houses and vacant land. My average for houses and land together was $288,540, only a $240 difference, making this the most likely explanation.
  3. Realtor.ca misrepresents graphs and averages through market capture and pay-gating, and could be violating the Competition Act.
**Background**
I have been looking to buy a house in the Fredericton area for the past several years and have been checking the online listings regularly through Realtor.ca, since it is the most common real estate listing website used in New Brunswick. What I liked about Realtor.ca was that it provides the average sale price for a house every month, with graphs showing the average sale price in Fredericton for the past 12 months and the past 10 years. Having looked for a house for several years, I felt I had a good idea of the market conditions and price ranges. My anecdotal impression was that the average house price was much higher than Realtor.ca’s estimate of $288,300, and I wondered whether that impression could be supported by data.
The objective of this report is to collect list price data from all available listings within the Fredericton area. Once collected, I can take the average price and see if it matches the average price shown by Realtor.ca.
**Average/Median Methodology**
When you use Realtor.ca, you can filter results by the property type. There are six property type categories: Residential (single family home), condo/strata, vacant land, recreational, multi-family, and agriculture. For each of these property types, the asking price and address were copied into an Excel file. The data was collected on May 10, 2024, and included all listings within Fredericton; duplicate listings were removed.
Once all data was collected, the average and median for each property type were calculated (Table 1). I compared my calculated average to the Realtor.ca average to determine whether my anecdotal impression (that the average house price is higher than Realtor.ca reports) was justified.
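For transparency, here is a minimal sketch (Python/pandas rather than Excel, with hypothetical file and column names) of the average/median calculation described above:
```python
import pandas as pd

# Hypothetical export of the manually collected listings.
# Assumed columns: property_type, address, asking_price.
listings = pd.read_excel("fredericton_listings_2024-05-10.xlsx")
listings = listings.drop_duplicates(subset=["address", "asking_price"])

summary = (listings.groupby("property_type")["asking_price"]
           .agg(listings="count", average="mean", median="median")
           .round(0))
print(summary)  # compare against Table 1
```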
**Results**
There were 107 listings for residential houses (referred to simply as “house” in this report), 245 listings for vacant land, 5 listings for recreational, 7 listings for multi-family, 2 listings for agriculture, and 10 listings for condos (Figure 1).
The average listing price was $543,878 for houses, $177,026 for land, $227,080 for recreation, $826,100 for multi-family, $829,450 for agriculture, and $317,410 for condos. The median listing price was $474,900 for houses, $64,900 for land, $229,900 for recreation, $799,000 for multi-family, $829,450 for agriculture, and $289,900 for condos (Table 1).
**Realtor.ca MLS System Average House Price Claim**
When you search for “houses for sale in Fredericton, NB”, the top search results show Realtor.ca. This is not surprising, since Realtor.ca and its Multiple Listing Service (MLS) have more listings than any other online real estate listing service for the Fredericton, NB, area. Having most real estate listings concentrated on one system can give users a general idea of broader market conditions beyond individual listings, such as averages and trends for cities. Realtor.ca provides this data in the form of “Market Price (CAD)” price trends for the past 12 months and for the past 10 years (Figure 2). These figures are prominently displayed at the end of the first page of the Fredericton real estate listings (URL: https://www.realtor.ca/nb/fredericton/real-estate).
This leads us to the first claim made by the Realtor.ca MLS system and to the initial objective of this report.
Claim: The average market price in Fredericton sits at $288,300 as of May 2024.
Analysis: When a user views these figures, it is a safe assumption that when a price is displayed, the user is inclined to believe that “Market Price (CAD)” is the average house price in Fredericton. This is further reinforced if the user reads the description above the figures which states:
“Use our home price trends to better gauge local market conditions and plan your next move. The graphs below show benchmark or average prices of homes sold in the area. Data generated by MLS® Systems and the MLS® Home Price Index (HPI) — Canada’s most advanced tool to gauge local home price levels and trends.”
This small paragraph specifically states, “The graphs below show benchmark or average prices of homes sold in the area.” Based on the graphs and that statement, we can safely interpret Realtor.ca as explicitly saying that the average home price in Fredericton, NB, currently sits at $288,300, leaving no room for interpretation in how the data can be viewed. The reason I want to be explicit about this is that if you look back at the results section of this report (Table 1), the calculated average of all house listings was $543,878, an 88.65% difference relative to the published figure. A couple of assumptions that could explain this difference are:
  1. The listings used in the analysis are only a snapshot in time and may not be an accurate or precise representation of the monthly price average.
  2. Houses listed below the average could be selling more quickly, giving a skewed data set that is not representative of all listings that have been posted.
  3. Realtor.ca reports the average sale price for houses in Fredericton, not the average listing price. There could be a large discrepancy between sale price and list price, which would make my calculated average look inflated.
The three assumptions above introduce uncertainty into my conclusions, but given the magnitude of the difference, it is reasonable to consider an alternative explanation for the discrepancy.
Since there is such a large discrepancy between my calculated average and the Realtor.ca average, I expanded the analysis to other categories. I combined my residential house data set with each of the other five property types to see how the combination changed the average and how close it came to the Realtor.ca figure (Table 2). Realtor.ca claims the average house price in Fredericton is $288,300, which is closest to my calculated average for the combination of house and land listings. This suggests that Realtor.ca calculates its average housing price using house and land listings together.
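As a quick arithmetic check, using only the counts and averages from Table 1 quoted above, the house-and-land combination does land within a few hundred dollars of Realtor.ca's published number:
```python
# Counts and average list prices from Table 1 (houses and vacant land).
house_n, house_avg = 107, 543_878
land_n, land_avg = 245, 177_026

combined_avg = (house_n * house_avg + land_n * land_avg) / (house_n + land_n)
print(round(combined_avg))  # ~288,541 vs. Realtor.ca's published $288,300

# Relative gap between the all-houses average and the published figure.
print(round((house_avg - 288_300) / 288_300 * 100, 2))  # ~88.65%
```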
Realtor.ca MLS's claim that the average house price in Fredericton, NB is $288,300 is a misrepresentation of the true market value and conditions. If a company were to calculate averages for an entire real estate market within an area, why would it include only houses and land and not the other four categories?
**Misleading Representations by Realtor.ca**
The conclusions from my analysis come with plenty of explanations and assumptions. Given that the MLS system is pay-gated and its proprietary house price index algorithms are private, I feel it is reasonable to assume that my data is closer to true market prices. This leads to the next question: even if my data isn't perfectly correct, why are the figures, calculations, and methodology misleading users about market conditions? The average user is not going to spend a significant amount of time manually collecting data and putting it into Excel to double-check Realtor.ca. The company is the largest multiple listing system used in New Brunswick, and holding that status comes with a degree of implicit public trust in the information it publishes. In this section, I will lay out sections and guidelines from the Competition Act and explain why I believe Realtor.ca is violating the Act.
**Competition Act**
For the below, I will be using the most updated version of the Competition Act, R.S.C., 1985, c. C-34, last amended on December 15, 2023 (https://laws.justice.gc.ca/eng/acts/C-34/page-1.html), and the “Application of the Competition Act to Representations on the Internet” published by Competition Bureau Canada (https://publications.gc.ca/collections/collection_2010/ic/Iu54-1-2009-eng.pdf).
*Section 2.2, Paragraph 4 of the Application of the Competition Act to Representations on the Internet*
“Businesses should not assume that consumers read an entire Web site, just as they do not read every word on a printed page. Accordingly, information required to be communicated to consumers to ensure that a representation does not create a false or misleading impression should be presented in such a fashion as to make it noticeable and likely to be read.”
Explanation: Section 2.2 applies to the average house price and its accompanying figures (Figure 2). Realtor.ca shows the average house price in text and graph form but does not disclose that, if my calculations are accurate, this is a combined house-and-land price average.
*Section 4.1, Paragraph 1 of the Application of the Competition Act to Representations on the Internet*
“If qualifying information is necessary to prevent a representation from being false or misleading when read on its own, businesses should present that information clearly and conspicuously. Businesses frequently use disclaimers, often signalled by an asterisk, to qualify the general impression of their principal representation when promoting their products or services. As mentioned earlier, the general impression conveyed by the representation, as well as its literal meaning, are taken into account in determining whether a representation is false or misleading.”
Explanation: Section 4.1 applies to Realtor.ca's house price indices and related methodology. A disclaimer in this case would belong in the same small paragraph above the figures; instead, they point to their own house price index, which obscures the methodology (Figure 2). The other option offered, below the graphs, is “Ask a realtor for more detailed information”, which creates an additional barrier for the user rather than serving “to qualify the general impression of their principal representation when promoting their products or services.” The “ask a realtor” hyperlink brings you to an additional page where you can find their realtors in your area, incentivizing the user to use their services over others to access more information. Realtor.ca has a majority market share in New Brunswick, which further reinforces monopolistic practices over real estate that hurt consumers.
*Section 4.1.3, Paragraph 1 of the Application of the Competition Act to Representations on the Internet*
“Businesses may effectively draw attention to a disclaimer so that it is more likely to be read by using attention-grabbing tools to display the disclaimer. In doing so, businesses must be careful not to design attention-grabbing tools in other parts of the advertisement in such a way that they distract the consumer’s attention away from the disclaimer, making it unlikely that the consumer will notice the disclaimer or recognize its importance.”
Explanation: Section 4.1.3 points to further obfuscation and misrepresentation in their graphical aids and calculations. As with Section 2.2 of the Application of the Competition Act to Representations on the Internet, Realtor.ca places these figures at the bottom of the first page of listings, drawing the user's attention to its own interpretation of the data.
*Section 52 (1) of the Competition Act: False or misleading representations*
“No person shall, for the purpose of promoting, directly or indirectly, the supply or use of a product or for the purpose of promoting, directly or indirectly, any business interest, by any means whatever, knowingly or recklessly make a representation to the public that is false or misleading in a material respect.”
Explanation: Section 52 (1) is the main argument for this report. I believe that Realtor.ca knowingly or recklessly misrepresented the average house price in Fredericton using deceptive graphical aids and created a home price index to further obfuscate the methodology.
I am not a lawyer, so I could be misinterpreting these sections of the Competition Act. I believe Realtor.ca has reached the threshold of violating the Act, since subsection 52(1.1) states:
“For greater certainty, in establishing that subsection (1) was contravened, it is not necessary to prove that (a) any person was deceived or misled; (b) any member of the public to whom the representation was made was within Canada; or (c) the representation was made in a place to which the public had access.”
This amendment to the Competition Act removed the requirement to prove that an individual or the public was actually deceived or misled. Given subsection 52(1.1), none of those three elements needs to be proved, so I believe Realtor.ca still meets the threshold for violating subsection 52(1) of the Competition Act.
**Conclusion**
I have given numerous caveats to my analysis, so it is possible I have come to the wrong conclusions, given the lack of transparency in methodology and the limited time frame. One thing I can conclude with confidence is that Realtor.ca misrepresents market conditions through its figures displaying average house prices, its pay-gating of information, and methodology disclosures disguised as a proprietary housing price index. Realtor.ca should make clear to the user how its housing price index is calculated. Realtor.ca and the MLS system have succeeded in capturing the market and fight to keep this information pay-gated, accessible only to people who benefit from these misleading claims. Regardless of their reasons, these monopolistic practices benefit only those inside their system by restricting information to shape the way the public perceives market conditions: a clear violation of the Competition Act and a disservice to the public.
There was a lot more I wanted to cover, such as whether Statistics Canada (u/StatCanada) sources its data from the MLS system, and the broader implications of sourcing data that may be misrepresented. Again, I could be wrong and would welcome any additional relevant information.
https://preview.redd.it/awfmkl0x6l0d1.png?width=1681&format=png&auto=webp&s=c9c4be8b6139c4f079ff343637b159b85e79cd3b
https://preview.redd.it/za540m0x6l0d1.png?width=3816&format=png&auto=webp&s=8c16fdcbc34795f46b38bdf502e1576fb43887dd
https://preview.redd.it/h5lz8p0x6l0d1.png?width=4166&format=png&auto=webp&s=3a76bdd71e64435fbaa38a768469d287b508946a
https://preview.redd.it/5qz74m0x6l0d1.png?width=3262&format=png&auto=webp&s=ab9363605bd6b31f324b5bb58f1fcc847f17a67b
submitted by PunnyHeals to newbrunswickcanada [link] [comments]


2024.05.15 14:47 PunnyHeals Misrepresentation of house price data by Realtor.ca

submitted by PunnyHeals to fredericton [link] [comments]


2024.05.13 14:04 Swing_Trader_Trading Large Cap Stalwarts Update 5/10/2024 – Up 25%

Large Cap Stalwarts Update 5/10/2024 – Up 25%
Blog Post:
https://preview.redd.it/z1l0pn8bo60d1.png?width=1024&format=png&auto=webp&s=b7758566f4391973e20d4331a12c91e2a375466c
The Model Portfolio Large Cap Stalwarts was updated this weekend. Twelve stocks were added and six were removed. All prices are as of 05/10/2024. The Portfolio now holds a total of 20 stocks.

Add – ACGL, BRO, CHRW, GOOGL, GRMN, HAS, HWM, MS, PHM, RCL, STLD, WAB

Remove – BLK, CAH, HST, NVR, TAP, VMC

The Model Portfolio Large Cap Stalwarts continues its outperformance over its benchmark RSP. It is now up 25% since it went live. This is 7% ahead of its benchmark RSP.
BackTest – The backtest for this Portfolio outperformed the RSP benchmark in 12 of the 19 years tested. It underperformed in 2008, 2009, 2011, 2013, 2018, and 2020, and was largely even with the benchmark in 2010. When the Portfolio outperformed, it was usually by a significant margin; the best year was 2017, when it was up by 44%.
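For readers who want to run a similar year-by-year comparison themselves, here is a minimal sketch (Python, with entirely hypothetical returns rather than the actual backtest data) of counting the years a portfolio beat its benchmark:
```python
# Hypothetical yearly total returns for a portfolio and its benchmark.
portfolio = {2017: 0.44, 2018: -0.08, 2019: 0.25, 2020: 0.10}
benchmark = {2017: 0.19, 2018: -0.10, 2019: 0.27, 2020: 0.12}

beat = [year for year in portfolio if portfolio[year] > benchmark[year]]
print(f"Outperformed in {len(beat)} of {len(portfolio)} years: {beat}")
```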

See the Large Cap Stalwarts Portfolio Detail Here

YOUTUBE Swing Trader Trading Channel

https://preview.redd.it/idb0ypgfo60d1.png?width=1009&format=png&auto=webp&s=6e537553554fa79a90adb7fe046f6d0909c30bf7

Performance 12/31/2022 to 05/10/2024

https://preview.redd.it/nc9vup1ko60d1.png?width=571&format=png&auto=webp&s=ccf15d776c7227cd118cf4e668825f13e6a7bdf0
https://preview.redd.it/medn6gbpo60d1.png?width=1024&format=png&auto=webp&s=42aad22ae3f479b3f5ddd45d8d89acbb70634398

The Large Cap Stalwarts Model Portfolio continues to outperform its RSP benchmark by a healthy amount. This month, 12 stocks were Added and 6 stocks Removed. This net addition is normal for the 2nd month of a quarter since many new earnings reports are in the database giving the software fresh data to work with. The Portfolio continues to outperform its equal weight S&P 500 index, RSP.

All content on this site is for informational purposes only and does not constitute financial advice. Consult relevant financial professionals in your country of residence to get personalized advice before you make any trading or investing decisions. Disclaimer
submitted by Swing_Trader_Trading to Swing_Trader_Trading [link] [comments]


2024.05.09 15:36 Nestledrink Game Ready Driver 552.44 FAQ/Discussion

Game Ready Driver 552.44 has been released.

Article Here: https://www.nvidia.com/en-us/geforce/news/ghost-of-tsushima-geforce-game-ready-drive
Game Ready Driver Download Link: Link Here
New feature and fixes in driver 552.44:
Game Ready - This new Game Ready Driver provides the best gaming experience for the latest new games supporting DLSS 3 technology including Ghost of Tsushima: Director’s Cut. Further support for new titles leveraging NVIDIA DLSS technology includes the launch of Homeworld 3 which supports DLSS Super Resolution.
Fixed Gaming Bugs
Fixed General Bugs
Open Issues
Additional Open Issues from GeForce Forums
Driver Downloads and Tools
Driver Download Page: Nvidia Download Page
Latest Game Ready Driver: 552.44 WHQL
Latest Studio Driver: 552.22 WHQL
DDU Download: Source 1 or Source 2
DDU Guide: Guide Here
DDU/WagnardSoft Patreon: Link Here
Documentation: Game Ready Driver 552.44 Release Notes | Studio Driver 552.22 Release Notes
NVIDIA Driver Forum for Feedback: Driver 552.44 Forum Link
Submit driver feedback directly to NVIDIA: Link Here
RodroG's Driver Benchmark: TBD
NVIDIA Discord Driver Feedback: Invite Link Here
Having Issues with your driver? Read here!
Before you start - Make sure you Submit Feedback for your Nvidia Driver Issue
There is only one real way for any of these problems to get solved, and that’s if the Driver Team at Nvidia knows what those problems are. So in order for them to know what’s going on it would be good for any users who are having problems with the drivers to Submit Feedback to Nvidia. A guide to the information that is needed to submit feedback can be found here.
Additionally, if you see someone having the same issue you are having in this thread, reply and mention you are having the same issue. The more people that are affected by a particular bug, the higher the priority that bug will receive from NVIDIA!!
Common Troubleshooting Steps
If it still crashes, we have a few other troubleshooting steps but this is fairly involved and you should not do it if you do not feel comfortable. Proceed below at your own risk:
If you are still having issue at this point, visit GeForce Forum for support or contact your manufacturer for RMA.
Common Questions
Bear in mind that people who have no issues tend to not post on Reddit or forums. Unless there is significant coverage about specific driver issue, chances are they are fine. Try it yourself and you can always DDU and reinstall old driver if needed.
Remember, driver codes are extremely complex and there are billions of different possible configurations. The software will not be perfect and there will be issues for some people. For a more comprehensive list of open issues, please take a look at the Release Notes. Again, I encourage folks who installed the driver to post their experience here... good or bad.
Did you know NVIDIA has a Developer Program with 150+ free SDKs, state-of-the-art Deep Learning courses, certification, and access to expert help. Sound interesting? Learn more here.
submitted by Nestledrink to nvidia [link] [comments]


2024.05.09 12:50 Main-Plan-4103 India's poverty

Poverty in India remains a major challenge despite overall reductions in the last several decades as its economy grows. According to an International Monetary Fund paper, extreme poverty, defined by the World Bank as living on US$1.90 or less in purchasing power parity (PPP) terms, in India was as low as 0.8% in 2019, and the country managed to keep it at that level in 2020 despite the unprecedented COVID-19 outbreak. According to the World Bank, India experienced a significant decline in the prevalence of extreme poverty from 22.5% in 2011 to 10.2% in 2019. A working paper of the bank said rural poverty declined from 26.3% in 2011 to 11.6% in 2019. The decline in urban areas was from 14.2% to 6.3% in the same period. The poverty level in rural and urban areas went down by 14.7 and 7.9 percentage points, respectively. According to United Nations Development Programme administrator Achim Steiner, India lifted 271 million people out of extreme poverty in a 10-year period from 2005–2006 to 2015–2016. A 2020 study from the World Economic Forum found "Some 220 million Indians sustained on an expenditure level of less than Rs 32 / day—the poverty line for rural India—by the last headcount of the poor in India in 2013."
The World Bank has been revising its definition and benchmarks to measure poverty since 1990–1991, with a $1.25 per day income on a purchasing power parity basis as the definition in use from 2005 to 2013. Some semi-economic and non-economic indices have also been proposed to measure poverty in India; for example, in order to determine whether a person is poor, the Multi-dimensional Poverty Index places a 33% weight on the number of years that person spent in school or engaged in education and a 6.25% weight on the financial condition of that person.
The different definitions and underlying small sample surveys used to determine poverty in India have resulted in widely varying estimates of poverty from the 1950s to the 2010s. In 2019, the Indian government stated that 6.7% of its population is below its official poverty limit. Based on 2019 PPPs from the International Comparison Program and according to the United Nations Millennium Development Goals (MDG) programme, 80 million people out of 1.2 billion Indians, roughly 6.7% of India's population, lived below the poverty line of $1.25, and 84% of Indians lived on less than $6.85 per day in 2019. According to the second edition of the Multidimensional Poverty Index (MPI) released by Niti Aayog, approximately 14.96% of India's population is considered to be in a state of multidimensional poverty. The National Multidimensional Poverty Index (MPI) assesses simultaneous deprivations in health, education, and standard of living, with each dimension carrying equal weight. These deprivations are measured using 12 indicators aligned with the Sustainable Development Goals (SDGs). On July 17, 2023, Niti Aayog reported a significant reduction in the proportion of poor people in the country, declining from 24.8% to 14.9% during the period from 2015–16 to 2019–21. This improvement was attributed to advancements in nutrition, years of schooling, sanitation, and the availability of subsidized cooking fuel. As per the report, approximately 135 million people in India were lifted out of multidimensional poverty between 2015–16 and 2019–21.
From the late 19th century through the early 20th century, under the British Raj, poverty in India intensified, peaking in the 1920s. Famines and diseases killed millions in multiple vicious cycles throughout the 19th and early 20th centuries. After India gained its independence in 1947, mass deaths from famines were prevented. Since 1991, rapid economic growth has led to a sharp reduction in extreme poverty in India. However, those above the poverty line live a fragile economic life. As per the methodology of the Suresh Tendulkar Committee report, the population below the poverty line in India was 354 million (29.6% of the population) in 2009–2010 and was 269 million (21.9% of the population) in 2011–2012. In 2014, the Rangarajan Committee said that the population below the poverty line was 454 million (38.2% of the population) in 2009–2010 and was 363 million (29.5% of the population) in 2011–2012. Deutsche Bank Research estimated that there are nearly 300 million people who are in the middle class. If these previous trends continue, India's share of world GDP will significantly increase from 7.3% in 2016 to 8.5% by 2020. In 2012, around 170 million people, or 12.4% of India's population, lived in poverty (defined as $1.90 (Rs 123.5)), an improvement from 29.8% of India's population in 2009. In their paper, economists Sandhya Krishnan and Neeraj Hatekar conclude that 600 million people, or more than half of India's population, belong to the middle class.
submitted by Main-Plan-4103 to u/Main-Plan-4103 [link] [comments]


2024.05.08 17:26 PitifulCall9574 5. Should Satoshi Nakamoto come out?

5. Should Satoshi Nakamoto come out?

Leave a Comment / By WeishaZhu / February 4, 2023
——Answers to the concepts in the article “Invite Satoshi Nakamoto to Welcome the New World”
5.1 First half: Satoshi Nakamoto leaves – correct
  1. The real reason he left was for fairness
  2. The departure of Satoshi Nakamoto marks the end of the trial period of the Bitcoin system
  3. The departure of Satoshi Nakamoto prompted the formation of a community mechanism
  4. Satoshi Nakamoto demoted himself to become a miner after leaving
  5. Bitcoin production is coming to an end
5.2 Second Half: Satoshi Nakamoto Leads the Bitcoin Community – Correct
  1. Satoshi Nakamoto must abide by the rules of the community
  2. There are many non-technical problems to solve
  3. The prospect of Bitcoin’s natural evolution is, at most, gold
  4. Four-year halving is not enough to drive Bitcoin’s exponential rise
  5. Bitcoin standard as the world’s largest application, can promote the rapid rise of Bitcoin
  6. Satoshi Nakamoto fulfilled his promise to realize the dream of transforming the world
"Satoshi Nakamoto should not come out" is the objection to this article that I discuss here. The reasoning goes that the coin price would plummet if Satoshi Nakamoto came out, and that if he did, the Bitcoin system would become centralized. This is the mainstream view in the Bitcoin community. I opened an account on the Bitcoin forum and serialized this article, and after only a few instalments I was permanently banned. I had thought the Bitcoin forum carried the spirit of Satoshi Nakamoto and was a free place, but they think Satoshi Nakamoto should not come out. Being banned reflects the negative results of centralization. The forum administrator said there was no need to respond with my opinion because the matter could not be resolved. It also reflects that the Bitcoin forum cannot keep pace with the times.
Centralization can be both good and evil and cannot be generalized. The same applies to anarchism, and it is likewise reasonable for human beings to abandon it.
5.1 First half: Satoshi Nakamoto leaves – correct
Satoshi Nakamoto left out of fear of legal persecution. Nevertheless, there were early signs of his departure, the most notable example being the natural-person account he registered on the Bitcoin forum in October 2010.
1. The real reason he left was for fairness
At that time, the idea of decentralization was not as popular as today. People did not understand the mechanism of decentralized currency issuance as much as today, and people would suspect that the designer was fraudulent. In 2010, the system became stable and began to have a social impact. His departure will make the currency issuance mechanism fair because there is no master.
Satoshi Nakamoto himself knew he held a dominant position; he knows the ins and outs of everything. For example, Jameson Lopp, in the article "Is Satoshi Nakamoto a Greedy Miner?", studied the Patoshi pattern and found that until October 2009 Satoshi Nakamoto dominated Bitcoin's computing power, owning more than half of the network hash rate. This means he also had the power to modify the ledger, and if someone has that power, Bitcoin is a meaningless game. Therefore, whether it is Jameson Lopp or Sergio Lerner, who proposed the Patoshi pattern, both believe this person is Satoshi Nakamoto. Today, of course, computing power is widely dispersed, and no one can amass such a large share.
The fairness of Bitcoin's currency issuance mechanism is reflected in the fact that Satoshi Nakamoto's coins were also mined. In the early days, Satoshi Nakamoto designed the Patoshi mining pattern, deliberately letting his machine pause for 5 minutes out of every 10. In his article, Jameson Lopp judges that the Patoshi work pattern matches a US Pacific Time schedule, and my article confirms that Satoshi Nakamoto lived in the Pacific Time Zone. Only Satoshi Nakamoto would do this kind of "stupid" thing, and the two findings reinforce each other.
Jameson Lopp's confirmation that the Patoshi work pattern matches a US Pacific Time schedule supports Sergio's result that Satoshi Nakamoto holds no fewer than 1.14 million bitcoins. Jameson Lopp also confirmed that those bitcoins had barely moved as of September 2022.
2. The departure of Satoshi Nakamoto marks the end of the trial period of the Bitcoin system
The Bitcoin system is a tool for mining bitcoins, just as mining equipment is a tool for mining gold. A tool that has to be changed often is an immature tool; tools need to be repaired, but the structure should not change. Satoshi Nakamoto's departure marks the point after which the Bitcoin system's structure has not changed. The most comparable case is Ethereum: its founder is still there, and he "fired" all the miners. Ethereum went from 1.0 to 2.0, and as a result the structure changed; today's Ethereum and yesterday's Ethereum share only the name. Bitcoin's claim that "0.1 is consistent" is sonorous and powerful.
3. The departure of Satoshi Nakamoto prompted the formation of a community mechanism
Satoshi Nakamoto is still here; decision-making will be centered on him. His departure is decision-making decentralization, which belongs to control decentralization. Bitcoin first formed a community mechanism in program development. The community is a consultation mechanism with internal checks and balances, which ensures that program development does not go astray. That is, it does not deviate from the Satoshi Nakamoto line.
4. Satoshi Nakamoto demoted himself to become a miner after leaving
The departure of Satoshi Nakamoto marks that the product can be offline. Anyone can use this product to make money; of course, Satoshi Nakamoto can. His mining continued until 2013. But this time, he is a Bitcoin civilian miner.
5. Bitcoin production is coming to an end
By the end of 2022, more than 19.6 million bitcoins had been mined, leaving about 6.6% still to be mined. That remaining share is no more than one year of US inflation, which in 2022 was around 8%. From 2009 to 2022, the goal was smooth, accident-free production of Bitcoin, and that goal has been achieved. Current development requires almost no work on the main network apart from second-layer technology. That means the first half is over; the remaining job is selling and promoting the product.
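A quick sanity check of the supply figures quoted above (a sketch that assumes the 21 million coin cap and the ~19.6 million mined figure from the text):
```python
# Remaining bitcoin supply as a share of the 21M cap, using the figures above.
CAP = 21_000_000
mined_end_2022 = 19_600_000

remaining_pct = (CAP - mined_end_2022) / CAP * 100
print(f"{remaining_pct:.1f}% of all bitcoin left to mine")  # ~6.7%, in line with the ~6.6% quoted
```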
Marketing must first be done by someone and the skills required are entirely different from those of production, so it should be market-oriented. It is the general law of product production and Marketing. Bitcoin is the product, and the Bitcoin system is the development tool. Satoshi Nakamoto is critical in the production and debugging phase of the Bitcoin system, and technical personnel are very important. In product operation and maintenance, the disappearance of Satoshi Nakamoto has more advantages than disadvantages, and technical personnel are still very important. But when all bitcoins are produced, the situation is reversed. At the end of the first half of Bitcoin, the importance of market personnel has risen, and the importance of users has increased. Bitcoin enters the second half, and the game’s rules are entirely different. The law that Satoshi Nakamoto disappeared no longer applies. Technical personnel are less critical than market personnel and users.
Who Are Bitcoin Marketers? There are very few products in modern times that don’t need marketing. Gold Doesn’t Need Marketing. Is It a Role Model for Bitcoin? Wrong! Wrong! Wrong!
5.2 Second Half: Satoshi Nakamoto Leads the Bitcoin Community – Correct
1. Satoshi Nakamoto must abide by the rules of the community
Now the Bitcoin community is a technical community, not a user community. Having no user community means users cannot participate in decision-making about the Bitcoin system. Bitcoin's technical community proposal mechanism is complete, and if Satoshi Nakamoto appeared, his proposal would have to go through the same process. If the "0.1 consistency" persists, the structure does not change and very few things can be changed. Satoshi Nakamoto would not agree to make Bitcoin a platform like Ethereum, so if he came out, he would have little impact on the technology.
2. There are many non-technical problems to solve
I said in the previous article that the Bitcoin technical solution is perfect, but there are also regrets; for example, there is no foundation, which is not as good as Ethereum. There is no incentive for maintenance technicians. There are no rewards for older contributors, most notably Laszlo. “I’m poor and need a donation,” he tweeted. He is a figure in the history of Bitcoin, so poor that everyone sighs.
Another example is that Bitcoin is too expensive; who can decide to move the decimal point to expand Bitcoin? It also includes establishing the Bitcoin user community, the foundation, etc. These require Satoshi Nakamoto to come forward to have enough influence.
3. The prospect of Bitcoin’s natural evolution is, at most, gold
Gold is an ownerless system and it is also a natural evolution. However, it cannot prevent governments from abolishing the gold standard. Can Bitcoin’s natural development fortune be better than that of gold? What’s more, it can’t replace gold. There is no such thing as pie in the sky; projects that develop naturally and not well-marketed projects will finally be off the stage by competitors. This situation would have happened long ago if it weren’t for Satoshi Nakamoto’s foresight and the opponent’s super-weak level. Because Bitcoin is interdisciplinary, there are many experts in a particular field, but few people really stand at the height of Satoshi Nakamoto. Bitcoin is very lucky, and it is also the luck of humanity.
4. Four-year halving is not enough to drive Bitcoin’s exponential rise
Many believe that a four-year halving is enough to drive Bitcoin up. The higher the price, the greater the market value and the greater the driving force required to promote the rise. If Bitcoin rises exponentially in four years, it must have the energy to drive the exponential rise. Where does the power come from? The number of people and the amount of money will increase exponentially. The volatility of Bitcoin is not as stable as big blue chips such as Apple.
In contrast, Bitcoin is a paradise for speculative funds. The volatility shows that Bitcoin's "trading depth" is not enough; that is, there is not enough money in it. The reason is that mainstream funds are not paying enough attention; only diversified safe-haven funds are. The number of bitcoin holders is less than half that of Ethereum. How can Bitcoin's price rise if these problems are not resolved? It is like dreaming of marrying a daughter-in-law: a beautiful idea. How can the exponential growth of Bitcoin be promoted? It is only possible if Satoshi Nakamoto leads the hard work of the Bitcoin community.
5. The Bitcoin standard, as the world’s largest application, can drive Bitcoin’s rapid rise
Bitcoin’s rise requires big applications, and payments are not it; payments are a paradise for stablecoins. The Bitcoin standard is large enough as an application, it is the application best suited to Bitcoin, and it is not easy for any other product to replace. It cannot be realized unless Satoshi Nakamoto comes out of seclusion.
6. Satoshi Nakamoto fulfilled his promise to realize the dream of transforming the world
Realizing the Bitcoin standard, the largest application in the world, requires Satoshi Nakamoto’s appeal and the community’s joint efforts.
The above is why Satoshi Nakamoto should come out of seclusion. If he does not, Bitcoin can at best be benchmarked against gold, and a market value of 50,000 to 10 trillion will be its ceiling. But that is not Satoshi’s goal.
Afterword
Here is a brief answer to the user’s questions. To understand this article, you must read chapters 10-13 of “Invite Satoshi Nakamoto to welcome the new world” and the Q&A articles in the previous sections 1-4.
https://chainless.hk/2023/02/04/5-should-satoshi-nakamoto-come-out/
submitted by PitifulCall9574 to chainlesshk [link] [comments]


2024.05.06 00:14 MultiKoopa2 Access violation error 0xC0000005

Not sure what's going on here. The game ran completely flawlessly for me 5 months ago on this exact same computer, same setup, installed to the same hard drive. I reset Cole's progress, and now, literally the INSTANT I set foot in his apartment I get this:
https://i.imgur.com/qo2JSqK.png
I have tried absolutely every solution I could find on Google from the past 2 years: I lowered the in-game settings to the absolute lowest (it's barely using half my VRAM now) and even dropped from 1080p to 720p.
I've uninstalled, reinstalled, verified game files, moved to a different drive, run both Steam and Deathloop as Administrator, and added an antivirus exception for the whole Deathloop installation folder; it just refuses to work.
I don't understand. I never encountered this issue a single time when I played through the game on this same computer half a year ago, on Epic and on Steam.
Would really appreciate any help
Crash summary : Error 0xC0000005 (Access Violation) occurred in thread 'worker3' at instruction location 0x00007FFAC11195AF.
OS version : Windows 10 Pro 2009 Build 22631.3447
GPU name : NVIDIA GeForce GTX 1070
GPU driver : 55222
GPU device removed error code : 0x887A0006
Callstack Function(desc) Line Bytes File Module Address
0 +0x795af   msvcrt.dll    0x7ffac11195af
0 +0xbbf719  Deathloop.exe 0x7ff62efff719
0 +0x171e737 Deathloop.exe 0x7ff62fb5e737
0 +0x171a5f2 Deathloop.exe 0x7ff62fb5a5f2
0 +0x17144b8 Deathloop.exe 0x7ff62fb544b8
0 +0x1717412 Deathloop.exe 0x7ff62fb57412
0 +0x17165f3 Deathloop.exe 0x7ff62fb565f3
0 +0x17165f3 Deathloop.exe 0x7ff62fb565f3
0 +0x17165f3 Deathloop.exe 0x7ff62fb565f3
0 +0x17165f3 Deathloop.exe 0x7ff62fb565f3
0 +0x17165f3 Deathloop.exe 0x7ff62fb565f3
0 +0x17165f3 Deathloop.exe 0x7ff62fb565f3
0 +0x17165f3 Deathloop.exe 0x7ff62fb565f3
0 +0x17120c3 Deathloop.exe 0x7ff62fb520c3
0 +0x17117ad Deathloop.exe 0x7ff62fb517ad
0 +0x1796506 Deathloop.exe 0x7ff62fbd6506
0 +0x16bcf58 Deathloop.exe 0x7ff62fafcf58
0 +0xbd8fa7  Deathloop.exe 0x7ff62f018fa7
0 +0xbd9c27  Deathloop.exe 0x7ff62f019c27
0 +0xb84786  Deathloop.exe 0x7ff62efc4786
0 +0xc23f17  Deathloop.exe 0x7ff62f063f17
0 +0x1257d   KERNEL32.DLL  0x7ffac186257d
0 +0x5aa48   ntdll.dll     0x7ffac332aa48

Register Info

RAX: 0x0000000000000020  RBX: 0x0000000000000020  RCX: 0x0000000000000020  RDX: 0x0000006DF3DBDB00
RSI: 0x0000006DF3DBDB00  RDI: 0x00000000001FFFE0  R8: 0x0000000000000020   R9: 0x0000000000000020
R10: 0x00000000001FFFE0  R11: 0x0000019BC6A60158  R12: 0x0000000000000000  R13: 0x0000000000000006
R14: 0x0000000000000020  R15: 0x0000019BF0A7E7D8  EFL: 0x0000000000010246
RIP: 0x00007FFAC11195AF  RSP: 0x0000006DF3DBD2C8  RBP: 0x0000006DF3DBD400

Exception Info

ExpCode: 0xC0000005 (Access Violation) ExpFlags: 0 ExpAddress: 0x00007FFAC11195AF
submitted by MultiKoopa2 to Deathloop [link] [comments]


2024.05.04 15:22 WinbuzzerMaria How to do a Hanging Indent in Microsoft Word

How to do a Hanging Indent in Microsoft Word

https://preview.redd.it/7sotv5cvueyc1.png?width=768&format=png&auto=webp&s=ad86b1d80d72498b7a80b34fb2e32ded8bfa1dc1
Creating a hanging indent in Microsoft Word is essential for formatting documents according to various style guides such as APA, MLA, and Chicago. This tutorial will guide you through the steps to create a hanging indent in Word, ensuring your documents meet academic or professional standards.
What is a hanging indent?
A hanging indent is where the first line of a paragraph is not indented, but subsequent lines are. This style is commonly used in bibliographies and reference lists, making it easier for readers to distinguish between entries.
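The same formatting can also be applied programmatically, which is handy for long reference lists. Below is a minimal sketch using the python-docx library; the 0.5-inch depth and the file names are assumptions for illustration, not values from this tutorial.

```python
# pip install python-docx
from docx import Document
from docx.shared import Inches

doc = Document("references.docx")  # assumed input file

for paragraph in doc.paragraphs:
    fmt = paragraph.paragraph_format
    # A hanging indent is a positive left indent combined with a negative
    # first-line indent: the first line stays flush left while wrapped
    # lines are pushed to the right.
    fmt.left_indent = Inches(0.5)
    fmt.first_line_indent = Inches(-0.5)

doc.save("references_hanging.docx")
```

In Word itself the equivalent setting is the Paragraph dialog's Indentation > Special > "Hanging" option.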
submitted by WinbuzzerMaria to winbuzzer [link] [comments]


2024.05.02 18:51 Mother-Lingonberry-2 Anyone else with ExpressVPN installation issues on OCLP MBP Mid-2014 running Sonoma?

Posting this question on OCLP as I could not find a fix with ExpressVPN technical support. Recapping the issue: I have done a clean install of macOS Sonoma 14.4.1 with Open Core Legacy Patcher 1.4.3 on my MacBook Pro 15 inch mid-2014 (MacBookPro11,3). Before the clean install it was running macOS Big Sur with ExpressVPN installed and working fine. With Sonoma 14.4.1 I cannot go beyond the startup window and cannot even activate the VPN as the application shuts down when I click on "Connection", prompting an error window from the OS that it shut down unexpectedly. I have spent hours with ExpressVPN technical support to reset the keychains, reinstall past and beta versions to no avail. No firewall nor antivirus is activated.
On a side note, my OCLP MacBook Pro mid-2009 with Monterey 12.7.4 can run ExpressVPN without any issue, which leads me to suspect this is a software issue, potentially a Sonoma/OCLP compatibility problem.
Has anyone experienced this issue with a patched Mac and eventually found a way to make it work?
submitted by Mother-Lingonberry-2 to OpenCoreLegacyPatcher [link] [comments]


2024.04.30 15:35 BigNegotiation5725 What to do next

What to do next
I got this yesterday; it's a Legion Pro 5 with an i7 and a 4060. I have updated the laptop and the graphics driver and uninstalled the antivirus. What should I do next, like benchmarks and so on?
submitted by BigNegotiation5725 to LenovoLegion [link] [comments]


2024.04.29 06:04 Deantasanto PC underperforming in League of Legends

https://www.microcenter.com/product/667282/acer-nitro-50-n50-640-ur11-gaming-pc
I mainly just play League of Legends, so I got a very cheap PC a while ago. To be quite frank, buying a cheap pre-built was a mistake. I did not realize at the time just how much of a pain in the ass the proprietary bullshit would be as well as general issues.
What I have changed about the PC: I have added another 8GB 3200 stick of their compatible proprietary memory, since any other memory gets limited to 2166 MHz. Even with their proprietary memory, the proprietary motherboard is not capable of XMP. I purchased one of their proprietary cables to add two more SATA drives. I moved my RTX 3060 from my previous PC into this one.
Despite these changes, I've still been having problems with my PC's performance in League of Legends (LoL). LoL is weird for specs because it is a CPU-bound game made in 2009, meaning it only uses two threads.
I ran a benchmark with MSI Afterburner and found the following abysmal results:
Average framerate : 154.0 FPS
Minimum framerate : 79.2 FPS
Maximum framerate : 164.7 FPS
1% low framerate : 90.5 FPS
0.1% low framerate : 48.3 FPS
I looked up how other PC's with an RTX 3060 and i5 12400 performed, and found this: https://www.youtube.com/watch?v=krs77VRTewc&t=440s
The average and 1% lows were twice as high as on my PC.
I ran Cinebench R23 to see how my CPU was performing. In the Single Core test I got a score of 1366, which is about 16% below the expected average for this CPU. With Passmark's free trial performance test I got a single-threaded score of 3182, about 10% below average. For reference, I also ran the multi-core test on Cinebench and got 8387, about 33% below average, and the overall CPU Mark test with Passmark gave 15423, about 20% below average.
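For anyone who wants to sanity-check those percentages, the arithmetic is just one minus the measured score divided by a reference score. A minimal sketch follows; the reference numbers are back-calculated from the percentages quoted above, not official averages for this CPU.

```python
# Rough check of the "underperformance" percentages quoted above.
# Reference scores are back-calculated from the post, not authoritative averages.
results = {
    "Cinebench R23 single-core": (1366, 1630),    # (measured, assumed reference)
    "Passmark single-thread":    (3182, 3540),
    "Cinebench R23 multi-core":  (8387, 12500),
    "Passmark CPU Mark":         (15423, 19300),
}

for name, (measured, reference) in results.items():
    shortfall = (1 - measured / reference) * 100
    print(f"{name}: {shortfall:.0f}% below reference")
```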
These tests consistently show that my CPU is underperforming, but I don't think that they were so bad as to get the results I'm getting in LoL.
I figured maybe my CPU is overheating, but according to HWiNFO64, in LoL I would never go above 80C, and even after benchmarking, my CPU's max temp was 89C with no thermal throttling.
What can I do to make my PC perform better short of swapping out the motherboard and memory?
submitted by Deantasanto to pchelp [link] [comments]


2024.04.28 19:01 Mackerel_Scales I read IWDA's financial report of 2023, so you don't have to.

You know how they say: "Don't invest in something you don't understand."
Me neither. So I'm working on understanding what I'm already investing in. Cue me taking on the silly idea this afternoon of reading the 1408-page (!) report of what IWDA did last year with the money I'm investing with them. [1]
Below is a summary of my findings, so you don't have to do the same.
  1. Page 4. The fund is Irish, and most people working and subcontracting are, but there is a lot of London involved as well. The big names behind seem to be BlackRock, J.P. Morgan, State Street, Deloitte and Citibank. Funnily enough, I thought it was administered by BlackRock, but it's actually State Street doing the administration. The assets are also being held by State Street.
  2. Page 5. There was a change of chairs. The chair is now William McKechnie. The man has a LinkedIn profile [2] saying he's a professor at the College of Europe in Bruges.
  3. Page 9. They lowered the TER on a bunch of bond ETFs. Not IWDA, unfortunately. They also launched some silly ETFs like an equal-weighted S&P 500 and Blockchain.
  4. Page 12. IWDA aka "iShares Core MSCI World UCITS ETF" is index tracking, non-replicating. It's not an Article 8 or 9 fund, so it cannot be called ESG.
  5. Page 14. An explanation of the relations between tracking difference, tracking error and TER (Total Expense Ratio). "The TER expresses the sum of all fees, operating costs and expenses, with the exception of direct trading costs, charged to each Fund’s assets as a percentage of the average Fund assets based on a twelve-month period ended 30 June 2023.", "Fund returns disclosed are the performance returns for the primary share class for each Fund, net of fees", "Realised tracking error is the annualised standard deviation of the difference in monthly returns between a fund and its benchmark index."
  6. Page 16. The numbers are for July 2022 to June 2023. The fund IWDA returned 18.58%. The benchmark is 18.51%. The tracking difference before TER was 0.27%, the tracking error after was 0.05%. So the IWDA outperformed the index. The reason for that is threefold.
    1. There is a net income difference. Page 18. "Comprising of withholding tax rate differential, tax reclaims and income timing differences between the Fund and the benchmark index." I think this refers to the bilateral tax agreement between Ireland and the US which allows dividends to only be taxed at 15%, while the index accounts for a 30% tax.
    2. Securities lending. More later.
    3. Investment techniques. Page 18. "Comprising of cash management, trading costs, currency hedging, futures held and sampling techniques." I guess this is saying that when you are not perfectly replicating, you might accidentally make a profit.
  7. They anticipate a tracking error up to 0.1% in the future. Notably, that is lower than the TER.
  8. Page 31. The board is attending all their meetings. Except Jessica Irschick. I can already see I am a lot like Jessica.
  9. Page 33. The board believes everybody in the board is paid fairly. "The maximum amount of remuneration payable to the Directors is determined by the Board and is set out in the prospectus of the Entity."
  10. Page 56. Lots of bladibla later, IWDA grew 8.29B USD. It paid 128M in taxes. It has 1B operating income and 94M operating expenses.
  11. Page 69 (nice), They started the year with 41B in assets. They added 8.2B in asset growth. 8.5B worth of shares were created, and 1.4B worth of shares were removed. The total number of assets in the fund is now 56.4B (I checked, that's 41B+8.2B+8.5B-1.4B).
  12. Page 82. Of that 56.4B, 192M is held in cash. 99.5% of the fund is held in assets. Like any good WSB autist, they also report spending 694k on margin cash.
  13. Page 96. IWDA had a VaR (Value at Risk) of 2.57% in 2023, down from 4.28% in 2022. Defined on page 95: "A 99% one day VaR means that the expectation is that 99% of the time over a one-day period each Fund will lose no more than this number in percentage terms." If you ever want to understand why we had a subprime financial crisis in 2009, this is why.
  14. Page 101. The assets are held at State Street Corp. It gets an S&P rating of A. Cue the famous blind lady scene from the movie "The Big Short".
  15. Page 103. On the 30th of June 2023, 5.3B worth of assets from IWDA were actually loaned out, for which they had received 5.9B worth of collateral. That's 9.4% of all IWDA assets, if my math is not off. It's a number which has almost doubled since 2022. The collateral is held in the following places: Bank of NY Europe, Euroclear or J.P. Morgan Chase Bank N.A.
  16. Page 124. Nearly 100% of the assets are valued at level 1, which means they are assets of which they are pretty sure of the stock price. Sometimes that can be a problem due to limited liquidity, but that's no issue for IWDA.
  17. Page 133. IWDA made 5M interest on its cash, received 1.05B worth of dividends. It notably also made 11M as income from lending securities. That's 0.02% on its assets. That's quite low, in my opinion.
  18. Page 147. Details on the losses made. Derivatives are mentioned, which I'm surprised by. It looks like 0.4% of assets are held as financial derivatives.
  19. Page 160. 128M USD was donated to various governments around the world in the form of withholding taxes. This is after taking into account bilateral agreements.
  20. Page 167 and page 176 have receivables and payables, but this is just accounting stuff afaik.
  21. Page 180. "The authorised share capital of the Entity is 2 subscriber shares of a par value of EUR1.00 each and 500,000,000,000 participating shares of no par value." That's an odd way to structure the company? Maybe it's for shielding purposes?
  22. Page 188. If the company would have gone bankrupt on June 2023, the holders of IWDA would be entitled to 54.6B in assets, or 84.28 USD per share. If I look up the stock price of IWDA on June 30th 2023, it closed at 84.26 USD, so that's pretty close! It traded between 83.34 and 84.41 USD that day.
  23. Page 198. The subinvestment manager is "BlackRock Asset Management North Asia Limited and BlackRock Asset Management Deutschland AG"
  24. Page 201. "The total income earned from securities lending transactions is split between the relevant Fund and the Securities Lending Agent. The Funds which undertake securities lending transactions receive at least 62.5%, while the Securities Lending Agent receives up to 37.5% of such income". So whoever is organising the securities lending gets to keep 37.5% of the profit for 0% of the risk. That's rich. I wonder who it is.
  25. Also page 201. The directors were paid 65 700 euro in fees. The auditors 313 000. That all seems cheap to me.
  26. Moving along 500 pages. From page 789 to page 811 is a list of all assets held in IWDA on the 30th of June 2023. It's also clearer what the derivatives are. They are Forward currency contracts and Euro Stoxx / SP500 futures. Those are totalling 0.03% of assets. 99.47% of assets are stock exchange listed securities. Notably, 0.51% of assets are "Other assets", I'm curious what is meant here.
  27. Page 1161. A list of companies which had the biggest change in number of assets.
  28. Page 1181. IWDA paid 3.7M in transaction costs. That's much cheaper than what I get at my broker.
  29. Page 1191-1195. If you want to know how the bankers get paid, this has the answer. For all the funds in iShares, the manager's staff got 220.4M. 118M fixed and 102.4M variable. 3940 people were paid here. The senior management got 21.6M. People with an impact on the risk profile got 30.8M.
  30. Page 1196. The securities lending agent is "BlackRock Advisors (UK) Limited"! Those are the guys making 37.5% on the securities lending. So via this loophole, BlackRock is making 37.5% / 62.5% x 11M ≈ 6.6M per year on IWDA, its own fund. Bankers. ¯\_(ツ)_/¯
  31. Page 1204. The people borrowing shares from IWDA. BNP Paribas, Societe Generale, etc. Natixis is the only one I don't know. BNP is holding 1.39B in loan, with 1.545B in collateral
  32. Page 1231. 4.7B in collateral is in the form of equities, 1.1B in the form of fixed income assets, probably bonds.
  33. Page 1242. Of this collateral, most of it is in Japan (484M), followed by Apple (184M). So it's not concentrated anywhere in particular.
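As a side note on item 13, here is a minimal sketch of how a one-day 99% historical VaR of that kind can be computed. The daily returns below are generated at random purely for illustration and are not IWDA's actual data, and the report itself may well use a parametric or Monte Carlo method rather than this simple historical approach.

```python
import random

# Hypothetical daily returns (as fractions); swap in a real return series to use this.
random.seed(42)
daily_returns = [random.gauss(0.0004, 0.01) for _ in range(500)]

# Historical 99% one-day VaR: the loss that 99% of days do not exceed,
# i.e. the 1st percentile of the daily return distribution, reported as a positive loss.
sorted_returns = sorted(daily_returns)
cutoff = int(0.01 * len(sorted_returns))
var_99 = -sorted_returns[cutoff]

print(f"One-day 99% historical VaR: {var_99:.2%}")
```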
My learnings:
As always, another pointer that the TER doesn't really make a difference. It's the index itself first, and then the tracking error you should look at.
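To make that learning concrete, the tracking difference and the realised tracking error described in items 5 and 6 can both be reproduced from a monthly return series. A minimal sketch, with made-up returns rather than IWDA's actual figures:

```python
import statistics

# Hypothetical monthly returns as fractions (0.012 = 1.2%); not IWDA's real data.
fund_returns  = [0.012, -0.008, 0.021, 0.005, -0.013, 0.017,
                 0.009,  0.004, -0.006, 0.015,  0.002, 0.011]
index_returns = [0.011, -0.009, 0.020, 0.005, -0.012, 0.018,
                 0.009,  0.003, -0.006, 0.014,  0.002, 0.010]

# Tracking difference: gap between the cumulative fund and index returns over the period.
cum_fund, cum_index = 1.0, 1.0
for f, i in zip(fund_returns, index_returns):
    cum_fund *= 1 + f
    cum_index *= 1 + i
tracking_difference = cum_fund - cum_index

# Realised tracking error: annualised standard deviation of the monthly return differences.
diffs = [f - i for f, i in zip(fund_returns, index_returns)]
tracking_error = statistics.stdev(diffs) * (12 ** 0.5)

print(f"Tracking difference: {tracking_difference:.4%}")
print(f"Realised tracking error (annualised): {tracking_error:.4%}")
```

A fund can therefore beat its index (positive tracking difference) while still showing a small tracking error, which is exactly the pattern described above.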
I'm surprised how much securities lending is being done, for how small a difference in income. I can see why it's profitable for BlackRock to do it, as they seem to siphon some money there. All in all, it looks like it is all done for quite paltry sums of money.
Anyway. Here's 2 hours of my life I'm not getting back. I hope there's something useful for you here. Feel free to ask follow-up questions if you have any. I'm not a banker nor working in finance, but someone might have an answer.
[1] https://www.ishares.com/uk/individual/en/literature/annual-report/ishares-iii-plc-en-annual-report-2023.pdf
[2] https://www.linkedin.com/in/william-mckechnie-3420aa276
submitted by Mackerel_Scales to BEFire [link] [comments]


2024.04.23 20:25 MicasItalo Stuttering, audio dropouts, high latency (wdf01000.sys, nvlddmkm.sys) with high GPU use (RTX 3070)

I formatted my computer last week after changing the motherboard, CPU and RAM and installing W11, and since I did this I have been having these problems (sorry for bad English). In benchmarks the video card works perfectly.
Edit: High GPU use even when idle https://imgur.com/a/zEYwjTk
What I tried:
My old config > new config
Ryzen 5 3600 > Ryzen 5 7600
MB B450 Asus Prime > B650 MSI Tomahawk
Corsair Vengeance DDR4 16GB > Corsair Vengeance DDR5 32GB
RTX 3070 Asus KO
XPG Core Reactor 850W (I tried with a Core Reactor 2 650W too)
Latencymon logs:
_________________________________________________________________________________________________________
CONCLUSION
_________________________________________________________________________________________________________
Your system appears to be having trouble handling real-time audio and other tasks. You are likely to experience buffer underruns appearing as drop outs, clicks or pops. One or more DPC routines that belong to a driver running in your system appear to be executing for too long. Also one or more ISR routines that belong to a driver running in your system appear to be executing for too long. One problem may be related to power management, disable CPU throttling settings in Control Panel and BIOS setup. Check for BIOS updates.
LatencyMon has been analyzing your system for 1:08:41 (h:mm:ss) on all processors.
_________________________________________________________________________________________________________
SYSTEM INFORMATION
_________________________________________________________________________________________________________
Computer name: DESKTOP-J6FBLSU
OS version: Windows 10, 10.0, version 2009, build: 19045 (x64)
Hardware: MS-7D75, Micro-Star International Co., Ltd.
BIOS: 1.E0
CPU: AuthenticAMD AMD Ryzen 5 7600 6-Core Processor
Logical processors: 12
Processor groups: 1
Processor group size: 12
RAM: 31937 MB total
_________________________________________________________________________________________________________
CPU SPEED
_________________________________________________________________________________________________________
Reported CPU speed (WMI): 3801 MHz
Reported CPU speed (registry): 380 MHz
Note: reported execution times may be calculated based on a fixed reported CPU speed. Disable variable speed settings like Intel Speed Step and AMD Cool N Quiet in the BIOS setup for more accurate results.
_________________________________________________________________________________________________________
MEASURED INTERRUPT TO USER PROCESS LATENCIES
_________________________________________________________________________________________________________
The interrupt to process latency reflects the measured interval that a usermode process needed to respond to a hardware request from the moment the interrupt service routine started execution. This includes the scheduling and execution of a DPC routine, the signaling of an event and the waking up of a usermode thread from an idle wait state in response to that event.
Highest measured interrupt to process latency (µs): 26208,90
Average measured interrupt to process latency (µs): 25,743358
Highest measured interrupt to DPC latency (µs): 26206,20
Average measured interrupt to DPC latency (µs): 22,821208
_________________________________________________________________________________________________________
REPORTED ISRs
_________________________________________________________________________________________________________
Interrupt service routines are routines installed by the OS and device drivers that execute in response to a hardware interrupt signal.
Highest ISR routine execution time (µs): 3975,490
Driver with highest ISR routine execution time: Wdf01000.sys - Kernel Mode Driver Framework Runtime, Microsoft Corporation
Highest reported total ISR routine time (%): 0,000470
Driver with highest ISR total time: Wdf01000.sys - Kernel Mode Driver Framework Runtime, Microsoft Corporation
Total time spent in ISRs (%) 0,000470
ISR count (execution time <250 µs): 174226
ISR count (execution time 250-500 µs): 0
ISR count (execution time 500-1000 µs): 60
ISR count (execution time 1000-2000 µs): 21
ISR count (execution time 2000-4000 µs): 2
ISR count (execution time >=4000 µs): 0
_________________________________________________________________________________________________________
REPORTED DPCs
_________________________________________________________________________________________________________
DPC routines are part of the interrupt servicing dispatch mechanism and disable the possibility for a process to utilize the CPU while it is interrupted until the DPC has finished execution.
Highest DPC routine execution time (µs): 19403,870
Driver with highest DPC routine execution time: nvlddmkm.sys - NVIDIA Windows Kernel Mode Driver, Version 552.22 , NVIDIA Corporation
Highest reported total DPC routine time (%): 0,023401
Driver with highest DPC total execution time: nvlddmkm.sys - NVIDIA Windows Kernel Mode Driver, Version 552.22 , NVIDIA Corporation
Total time spent in DPCs (%) 0,046666
DPC count (execution time <250 µs): 2260723
DPC count (execution time 250-500 µs): 0
DPC count (execution time 500-1000 µs): 12433
DPC count (execution time 1000-2000 µs): 3293
DPC count (execution time 2000-4000 µs): 815
DPC count (execution time >=4000 µs): 195
_________________________________________________________________________________________________________
REPORTED HARD PAGEFAULTS
_________________________________________________________________________________________________________
Hard pagefaults are events that get triggered by making use of virtual memory that is not resident in RAM but backed by a memory mapped file on disk. The process of resolving the hard pagefault requires reading in the memory from disk while the process is interrupted and blocked from execution.
NOTE: some processes were hit by hard pagefaults. If these were programs producing audio, they are likely to interrupt the audio stream resulting in dropouts, clicks and pops. Check the Processes tab to see which programs were hit.
Process with highest pagefault count: msmpeng.exe
Total number of hard pagefaults 2619
Hard pagefault count of hardest hit process: 398
Number of processes hit: 49
_________________________________________________________________________________________________________
PER CPU DATA
_________________________________________________________________________________________________________
CPU 0 Interrupt cycle time (s): 141,982186
CPU 0 ISR highest execution time (µs): 1618,930
CPU 0 ISR total execution time (s): 0,147495
CPU 0 ISR count: 109217
CPU 0 DPC highest execution time (µs): 19403,870
CPU 0 DPC total execution time (s): 20,008865
CPU 0 DPC count: 391304
_________________________________________________________________________________________________________
CPU 1 Interrupt cycle time (s): 8,869767
CPU 1 ISR highest execution time (µs): 3975,490
CPU 1 ISR total execution time (s): 0,021170
CPU 1 ISR count: 14564
CPU 1 DPC highest execution time (µs): 12475,340
CPU 1 DPC total execution time (s): 0,505732
CPU 1 DPC count: 16370
_________________________________________________________________________________________________________
CPU 2 Interrupt cycle time (s): 13,880712
CPU 2 ISR highest execution time (µs): 1585,530
CPU 2 ISR total execution time (s): 0,009836
CPU 2 ISR count: 6589
CPU 2 DPC highest execution time (µs): 4219,350
CPU 2 DPC total execution time (s): 1,176704
CPU 2 DPC count: 888760
_________________________________________________________________________________________________________
CPU 3 Interrupt cycle time (s): 10,427785
CPU 3 ISR highest execution time (µs): 2,380
CPU 3 ISR total execution time (s): 0,000027
CPU 3 ISR count: 32
CPU 3 DPC highest execution time (µs): 7678,420
CPU 3 DPC total execution time (s): 0,372004
CPU 3 DPC count: 307761
_________________________________________________________________________________________________________
CPU 4 Interrupt cycle time (s): 7,507474
CPU 4 ISR highest execution time (µs): 0,70
CPU 4 ISR total execution time (s): 0,000002
CPU 4 ISR count: 8
CPU 4 DPC highest execution time (µs): 3053,610
CPU 4 DPC total execution time (s): 0,398781
CPU 4 DPC count: 284232
_________________________________________________________________________________________________________
CPU 5 Interrupt cycle time (s): 7,735267
CPU 5 ISR highest execution time (µs): 0,0
CPU 5 ISR total execution time (s): 0,0
CPU 5 ISR count: 0
CPU 5 DPC highest execution time (µs): 1866,410
CPU 5 DPC total execution time (s): 0,427755
CPU 5 DPC count: 291967
_________________________________________________________________________________________________________
CPU 6 Interrupt cycle time (s): 5,397439
CPU 6 ISR highest execution time (µs): 0,0
CPU 6 ISR total execution time (s): 0,0
CPU 6 ISR count: 0
CPU 6 DPC highest execution time (µs): 1960,50
CPU 6 DPC total execution time (s): 0,144666
CPU 6 DPC count: 87413
_________________________________________________________________________________________________________
CPU 7 Interrupt cycle time (s): 2,595652
CPU 7 ISR highest execution time (µs): 0,0
CPU 7 ISR total execution time (s): 0,0
CPU 7 ISR count: 0
CPU 7 DPC highest execution time (µs): 1642,910
CPU 7 DPC total execution time (s): 0,004569
CPU 7 DPC count: 838
_________________________________________________________________________________________________________
CPU 8 Interrupt cycle time (s): 2,801453
CPU 8 ISR highest execution time (µs): 1621,130
CPU 8 ISR total execution time (s): 0,012420
CPU 8 ISR count: 6080
CPU 8 DPC highest execution time (µs): 292,930
CPU 8 DPC total execution time (s): 0,004408
CPU 8 DPC count: 1602
_________________________________________________________________________________________________________
CPU 9 Interrupt cycle time (s): 2,588569
CPU 9 ISR highest execution time (µs): 1788,390
CPU 9 ISR total execution time (s): 0,009786
CPU 9 ISR count: 4220
CPU 9 DPC highest execution time (µs): 180,310
CPU 9 DPC total execution time (s): 0,002906
CPU 9 DPC count: 809
_________________________________________________________________________________________________________
CPU 10 Interrupt cycle time (s): 2,939392
CPU 10 ISR highest execution time (µs): 1462,280
CPU 10 ISR total execution time (s): 0,018650
CPU 10 ISR count: 19843
CPU 10 DPC highest execution time (µs): 1010,470
CPU 10 DPC total execution time (s): 0,022590
CPU 10 DPC count: 4804
_________________________________________________________________________________________________________
CPU 11 Interrupt cycle time (s): 2,659189
CPU 11 ISR highest execution time (µs): 2645,250
CPU 11 ISR total execution time (s): 0,013262
CPU 11 ISR count: 13756
CPU 11 DPC highest execution time (µs): 1632,410
CPU 11 DPC total execution time (s): 0,008563
CPU 11 DPC count: 1599
_________________________________________________________________________________________________________
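If you end up comparing several LatencyMon runs (different drivers, BIOS settings, and so on), a small script can pull the headline offenders out of each saved report. A minimal sketch is below; it assumes the report was exported as plain text in the layout pasted above, and the file name is an assumption.

```python
import re

# Assumed file name; export the LatencyMon report as plain text first.
with open("latencymon_report.txt", encoding="utf-8") as fh:
    report = fh.read()

fields = [
    "Highest measured interrupt to process latency (µs)",
    "Highest ISR routine execution time (µs)",
    "Driver with highest ISR routine execution time",
    "Highest DPC routine execution time (µs)",
    "Driver with highest DPC routine execution time",
]

for field in fields:
    # Values may use a comma as the decimal separator (e.g. "19403,870").
    match = re.search(re.escape(field) + r"\s*:\s*(.+)", report)
    if match:
        print(f"{field}: {match.group(1).strip()}")
```

Run against the report above, it should surface nvlddmkm.sys and Wdf01000.sys as the two drivers to focus on.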
submitted by MicasItalo to techsupport [link] [comments]


2024.04.23 15:24 therealbotaccount T14 gen 4 vs t14 gen 5

Hi. My T480 died today because of its processor.
I am deciding between the T14 Gen 4 and the T14 Gen 5. I know the big advantage of the Gen 5 over the Gen 4 is repairability. I normally purchase with customization, and I compared the two based on the specs below:
  1. CPU gen 4 i5-1335 vs gen 5 ultra 5 125H
  2. Memory 24gb
  3. 512 gb storage.
The difference between these two is just ~150 USD. 150 USD is quite a lot for me, but price should not be a concern for now.
The only obvious difference I can see is in CPU benchmarks, where the scores are:
i5-1335 - 16268
Ultra 5 125H - 21572
i7-8550U - 5918 (old T480)
Benchmarking my old T480 against both the Gen 4 and the Gen 5, the ratio of 'user experience' from the i7 to the Gen 4 and from the i7 to the Gen 5 is 2.7x vs 3.6x respectively. So the difference between them is not that big, I guess.
Note: user experience is subjective. I only use existing data for justification purposes.
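For transparency, the 2.7x and 3.6x multipliers are simply each benchmark score divided by the old T480's score. A minimal sketch with the scores listed above (treating a benchmark ratio as a stand-in for 'user experience' is, as noted, a big simplification):

```python
# Ratio of each candidate's benchmark score to the old T480's i7-8550U score.
old_score = 5918  # i7-8550U (T480)
candidates = {
    "T14 Gen 4 (i5-1335)": 16268,
    "T14 Gen 5 (Ultra 5 125H)": 21572,
}

for name, score in candidates.items():
    print(f"{name}: {score / old_score:.1f}x the old T480")
```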
The reason I am very particular about the CPU is that I tend to open multiple browser brands with multiple windows and multiple tabs each, and I have observed that this activity can consume a lot of CPU. It is even worse when the antivirus/firewall is scanning web traffic, which makes performance sluggish.
Having said that, I am leaning towards the Gen 4, but what other things should I look into? I am not a power user, and I rarely customize the laptop after purchase.
submitted by therealbotaccount to thinkpad [link] [comments]


2024.04.22 19:23 Swing_Trader_Trading Small Cap Discoveries Portfolio Update – 04/19/2024 – Up over 48% for 2023-2024

Small Cap Discoveries Portfolio Update – 04/19/2024 – Up over 48% for 2023-2024
Blog Post:

https://preview.redd.it/zywpdf1ce2wc1.png?width=1024&format=png&auto=webp&s=b4b5e75a97b1a30322ec5ff2635583c33f76c9b7
The Model Portfolio Small Cap Discoveries was updated this weekend. New Adds of 9 stocks were made and Removals of 9 stocks were made. All prices are as of 04/19/2024. A total of 21 stocks are in the Portfolio now.
The Portfolio continues to outpace its benchmark, IJR, by a wide margin. The Portfolio is up over 48% for 2023-2024 and 38% ahead of its benchmark.
BackTest – The Backtest for this Portfolio outperformed the IJR benchmark in 15 out of the 19 years tested. It underperformed in years 2008, 2015, 2017 and 2019. When the Portfolio outperformed, it was usually by a wide margin. The best year was 2009 when it was up by 127%.

Add – AMTX, CSTE, DERM, HNST, ILPT, NOA, TTSH, UTI, VRA

Remove – CPLP, HCSG, MG, MTW, NOTV, PANL, SXC, ULBI, VIRC

Noteworthy are ORN (added 2/26/24) and HRTG (added 08/18/23). They are up 28% and 80% respectively. VIRC (added 9/22/23) was Removed by the software. It was up 60%.

See the Small Cap Discoveries Portfolio Detail Here

YOUTUBE Swing Trader Trading Channel

In the Six Model Portfolios, Mid Cap Flyers, Small Cap Discoveries, CANSLIM Growth, Large Cap Stalwarts, BI-Weekly Swings and Quant Alpha’s, the end-of-week stats are below.

https://preview.redd.it/g2miaqige2wc1.png?width=1009&format=png&auto=webp&s=884fd119da244b646ea1236b720ee194d3460a21

Performance 12/31/2022 to 04/19/2024


https://preview.redd.it/xoaq8owle2wc1.png?width=577&format=png&auto=webp&s=5561fe5e71f98ef08afb25ddaf9ff15c49cbe909

https://preview.redd.it/qlc8kg2oe2wc1.png?width=1024&format=png&auto=webp&s=65a03859eb91a937fb035ebee37b84b9cda77225

This update of the Small Cap Discoveries Model Portfolio is highlighted by its continuing outperformance over its benchmark. The Portfolio is up 48% since it went live at the beginning of 2023. The newly Added stocks and the Retained ones, now contain no tech stocks. This demonstrates that one does not need to be invested in tech stocks to make money in the stock market over time.

All content on this site is for informational purposes only and does not constitute financial advice. Consult relevant financial professionals in your country of residence to get personalised advice before you make any trading or investing decisions. Disclaimer
submitted by Swing_Trader_Trading to Swing_Trader_Trading [link] [comments]


2024.04.17 20:54 Grouchy-Serve-1203 Can someone grade this essay please and give specific feedback

The postmodern United States has faced the pressing obstacle of food insecurity, but the Supplemental Nutrition Assistance Program, or SNAP, has resolved this issue. Introduced in 1964, SNAP engineered a financial support system for low-income families through food-purchasing assistance. To properly gauge the effectiveness of SNAP, the program must be evaluated in four separate ways. First, it is significant to note fraud’s previously adverse effect on food stamps, yet more crucial to emphasize how SNAP eradicated fraud. Secondly, SNAP will be determined as a generally inclusive program that has faced occasional instances of restriction. Thirdly, SNAP will be assessed as a tool for eliminating poverty. Lastly, SNAP will receive praise for its effective nutritional competencies. Although at times SNAP has been troubled by fraud and exclusivity, the program has evolved to consistently provide a stable support system for economically disadvantaged citizens.
Background
SNAP’s creation in 1964 was not the first food stamp program (FSP) in American history. In fact, the program hinged on prior success with food stamps. In 1939, the United States was still battling the Great Depression, and as a result, there was a food surplus and high unemployment (“Short History”). These conditions led to the United States experimenting with a food stamp plan. The plan worked by giving out stamps to the needy who applied, who then used the stamps by exchanging them for a food product. These foods were typically cheaper and in surplus. Still, a wide range of foods was available for stamp exchange, varying from eggs to pork to fruits (DPLA). This FSP lasted only from 1939 to 1943, but those four years made a substantial impact. By October 31, 1940, the program was in operation or had been announced for 218 areas with a combined population of approximately 39 million people (DPLA). Participation was widespread and plentiful. At any one point during these four years, food stamps helped support approximately 20 million citizens (“Short History”). The spring of 1943 marked the end of this first chapter for food stamps, as “the conditions that brought the program into being--unmarketable food surpluses and widespread unemployment--no longer existed" (“Short History”).
Although the Food Stamp Program was gone, the desire for them remained ubiquitous among Americans. The next two decades were filled with studies, reports, and legislative proposals. Acting in the interests of the American People, senators such as Aiken, La Follette, Humphrey, Kefauver, and Symington sought to pass food stamp program legislation (“Short History”). Finally in 1961, 18 years after the end of the first program, President John F. Kennedy announced a pilot FSP. On May 29, 1961, Mr. and Mrs. Alderson Muncy of Paynesville, West Virginia, were the first food stamp recipients. Three years later, the pilot program had expanded to 380,000 participants (“Short History”). August 31, 1964, marked the day when the pilot FSP became a permanent reality. On a warm and sunny day foreshadowing the great progress to come, Lyndon B. Johnson signed onto the Supplemental Nutrition Assistance Program. Johnson declared that “as a permanent program, the food stamp plan will be one of our most valuable weapons for the war on poverty”, alluding to its purpose of enabling “low income families to increase their food expenditures, using their own dollars” (APP). Food stamps would eventually become America’s largest poverty assistance program, but not without a share of troubles along the way.
Fraud
Although SNAP was hindered by fraud in its early years, EBTs helped mitigate fraud and eventually almost completely eradicate it. When SNAP was enacted into law, observers were concerned about possible fraud and abuse within the program. Throughout the late 20th century, these concerns became a reality, emerging as a large threat to the security and stability of SNAP (Tejani). In a 1981 New York Times article, William Shaker, executive director of the National Tax Limitation Committee, noted “food stamps score right at the top of the list in terms of misspent federal funds” (Roberts). The primary form of fraud was trafficking, which involved individuals and stores turning SNAP benefits into cash. In 1993, 3.8% of SNAP benefits were trafficked (Bartfeld 227). Measured against SNAP’s 24 billion dollar budget in 1993 (Sugarman), that amounts to roughly 900 million dollars stripped from the needy. Faced with rising criticism about fraud in SNAP, the USDA was pressured and left in limbo about how to address the situation.
The Electronic Benefit Transfer (EBT) would be the answer to the problem fraud presented. An EBT is an “electronic system that allows a recipient to authorize transfer of their government benefits from a federal account to a retailer account to pay for products received” (“Short History”). Funds are delivered on a debit card. Its effectiveness lies in creating a record of each food stamp transaction. As a result, it is easier to identify violations and eradicate fraud. EBTs first appeared in 1984 but it was only until 2004 that they became mandated, and thus the standard (“Farmer’s Market”). The effectiveness of EBTs cannot be overstated. By 2010, the fraud rate had declined threefold from 1993 to a mere 1.3 percent. Judith Bartfeld, a University of Wisconsin professor and Food Security Research and Policy Specialist, attributes this decline to the shift from paper coupons to Electronic Benefit Transfers and their improved tracking of SNAP purchases (Bartfeld 227). 
Outside of EBTs, however, the Department of Agriculture has taken many other steps. The USDA enforces rules against trafficking and reducing overpayment rates through annual quality control audits. The result: the level of fraud in SNAP has become substantially less than in other assistance programs. (Bartfeld 227). In one such attempt to suppress fraud, households must contact the local SNAP office to report if their income increases dramatically. They also must reapply for SNAP periodically — typically every six to 12 months for most families and every 12 to 24 months for older adults and people with disabilities (CBBP 3). Household income is constantly being supervised to make sure that families opting into SNAP need support. If a family no longer needs food stamps, their income will show this, and the family will not receive support anymore. Instead, the benefits they had can be funneled back into the program and given to families in greater need of help. In general, fraud has consistently decreased throughout SNAP’s history, serving as a testament to the invention of the EBT and the proactive and perseverant work of the USDA.
Inclusion
Although at times discriminatory, SNAP has generally been accessible to citizens regardless of identity. Even at its inception in 1964, SNAP “prohibited against discrimination on bases of race, religious creed, national origin, or political beliefs” (“Short History”). Notably, in the same year, the Civil Rights Act was passed, which was the “most sweeping civil rights legislation since Reconstruction” (“Civil”). The alignment of these two events positions SNAP as a trailblazing program. As one of the first programs created under the Civil Rights Act, it emerged as a benchmark of success in the fight against racism.
By 1970, the FSP had expanded to US islands and territories and had also created a sole national eligibility standard (“Short History”). In this, SNAP addressed a former shortcoming of the program: divided and often unfair standards between different states and regions. The eligibility standards have changed slightly throughout the years, but in general, have specified that households must have a gross monthly income below around 130% of the federal poverty level to be eligible for SNAP (Han 467). Furthermore, the program is scaled such that households with the lowest incomes receive larger benefits than households closer to the poverty line (CBBP 3). SNAP executes two amazing ideas here. First, they make sure that not only people below the poverty line can obtain help. There is often a notion that the group just above a cutoff gets the shortest end of the stick, and is in a more unfortunate position than people receiving help. SNAP counters this by including households over the poverty line in eligibility standards. This has been proven effective, as a substantial number of households with income between 100 and 200% of the poverty line have taken advantage of SNAP’s support (Han 484). Then, to ensure true equality, the program is scaled such that the poorest households receive the most help while households who just meet the eligibility standards receive the least help in the program. The result is a balanced system where SNAP helps at different levels to provide an equal overall standing for all participants.
Still, SNAP has fallen short at times. For the first thirteen years in SNAP’s history, a cooking facility was required of households to receive benefits, but the citizens who didn’t have cooking facilities were often the poorest and couldn’t financially afford such amenities (“Short History”). As a result, the most disadvantaged citizens were left without assistance. Furthermore, budget cutbacks have sometimes forced SNAP to limit who receives access to support. In 1996, SNAP was forced to eliminate the eligibility of most legal immigrants. Luckily, six years later, eligibility was restored to qualified aliens who had been in the United States for at least five years (“Short History”). The trajectory of inclusion within SNAP is generally upward, as it continuously strives to create a fair program. In a sixty-year history, there will inevitably be points of weakness, yet SNAP has strongly combatted these moments by ensuring that all who need support can receive it.
Effect on Poverty
Stellar participation, combined with effective regulations, has led to SNAP having a profound impact on poverty. Throughout its first decade, SNAP’s participation grew exponentially as the program expanded geographically (“Short History”). SNAP started at 500,000 users in 1964 but by 1975 had risen to 19.2 million people. The USDA estimated that at this point approximately 40 million Americans were eligible to receive support (Claffey, Stucker 44). Throughout the 1980s and 1990s, participation remained at a generally constant ratio of eligible citizens participating. The factors influencing the participation rate throughout these decades had little to do with the effectiveness of SNAP as a program. For example, after the 1980 and 1982 recessions, the macroeconomy improved, causing fewer people to be eligible for SNAP (Bartfeld 12). As a result, participation went down, but this bore no relation to SNAP’s effectiveness; the percentage of eligible people participating remained relatively unchanged. From 1990 to 1994 there was a surge in participation as a result of growth in the Aid to Families with Dependent Children (AFDC) program. AFDC participants were automatically eligible for food stamps, thus causing an increase in food stamp participation. The 1980s and 1990s saw a substantial number of eligible households participating in SNAP, which was already impressive, but the program would soon reach new heights.
Onwards from the 2000s, there has been a jump in SNAP beneficiaries and the implementation of EBTs is entirely responsible for this. EBTs were introduced in the early 2000s, and between 2000 to 2014 the number of recipients exploded 272% (Han 467) from 17.2 million participants out of 47 million eligible to 46.7 million out of 65 million (“Yearly”). In 2000, 37% of eligible recipients took advantage of the program, but only fourteen years later this number almost doubled to 72%. This jump is an outlier, far exceeding any other period of growth in SNAP history. This suggests that EBTs are solely responsible for the ascent in participation. The appeal of EBTs lies in how they conceal that a consumer is on SNAP. Before EBTs, people were less willing to put their pride aside, use a brightly colored stamp, and reap the benefits of SNAP. But when this stamp turned into a regular debit card, more households joined, and participation skyrocketed.
Greenmarket, a network of farmers markets in New York City, is a compelling case study that measures EBTs impact. Greenmarket was given a grant to evaluate the implementation of EBT in their markets. The result was astonishing. In 2008 they received $101,000 in SNAP sales compared to their $251,000 in 2009 (“Farmer’s Market). As a result of implementing EBT, sales boosted 2.5 times. The boosted sales at Greenmarket reflect the capabilities of EBT to attract SNAP users.
Along with high participation rates, SNAP has effectively reduced poverty for users. Although SNAP was enacted as only a minor part of a social safety net, it has grown to become a key structure in the social support system (Bartfeld 225). In a typical year, SNAP lowers the poverty rate by 5 to 10 percent. Furthermore, SNAP lowers the deep poverty percentage-the fraction of people living on less than half of the poverty line- by 10 to 20 percent in a year (Bartfeld 8). Out of any socioeconomic group, people in deep poverty are most in need of help. SNAP redefines finances for many of these users, giving the poorest in society much-needed support. When it comes to the idea of a support system, truly no other program has as much impact on citizens as SNAP. Firstly, SNAP purchases are not taxed. This gives families saving each penny a little more comfort and also ensures that users have to give absolutely nothing on each SNAP purchase. Secondly, because SNAP helps households pay for food, it gives families more cushion in other expenditures. SNAP reduces the risk that households will fall behind on their nonfood essential expenses. This includes housing by 7.2%, utilities by 15.3%, and risk of medical hardship by 8.5% (Guttierez and Schaefer 774). And when it comes to food, SNAP participants receive remarkable support. SNAP participants are between 14.9 and 36.6 percent less likely to be food insecure than nonparticipants (Gunderson 97). At its minimum, SNAP reduces rates of food insecurity by 15%. Throughout SNAP’s history, many have criticized the program, noting that it is too costly or goes to those who don’t need it. The facts say otherwise. SNAP costs 0.5% of the US GDP, but the SNAP’s impacts far outweigh that cost. For the 0.5% in GDP, America gets a 16 percent reduction in poverty. Furthermore, the poverty gap, which measures the depth of poverty, is cut by 41% (Bartfeld 76). With SNAP, America gets an effective program that cuts poverty metrics by double-digit percentages. It is the most costly support program in America, but the impacts of SNAP justify the cost tenfold. 
Nutrition
There is a common misconception that SNAP is unhealthy and causes obesity; if anything, SNAP reduces obesity rates. SNAP benefits cannot be used to buy alcohol, cigarettes, or tobacco (Dixon), three items linked with obesity. SNAP also extends eligibility to many whole foods in an attempt to encourage a balanced diet. However, many users, burdened with limited finances, will understandably choose to buy processed food as it contains more calories at a cheaper price. Still, the presence of food causes less obesity than no food at all. Studies have found that food-insecure children are at least twice as likely to report being in poor health compared to food-secure children (Gunderson and Ziliak). Charles Baum of Middle Tennessee State University concluded from his studies that “the Food Stamp Program has probably had virtually an immeasurably small impact on the growing prevalence of obesity” (Baum 645). Craig Gunderson, a professor at the University of Illinois with a research emphasis on SNAP, takes this hypothesis one step further. Gunderson first notes the inverse relationship between obesity rates and an increase in income. Given this, he argues that “it appears unlikely that receiving more money to purchase food would lead to higher rates of obesity. Thus we would anticipate that SNAP recipients are less likely to be obese than eligible nonrecipients” (Gunderson 99). The overt intention of SNAP is to provide support for underprivileged households, not to improve nutritional outcomes within this group of society. Still, SNAP is successful in keeping obesity from plaguing its program.
Conclusion
SNAP is a progressive program that has redefined the social support system as it is known. Every year, millions of Americans have found solace in their card or stamps when paying for food. Their overwhelming faith in SNAP proves the program’s value. This faith also reflects the improvements SNAP has made to become a champion for underprivileged citizens. Years of legislation and additions such as the EBT have left the program better than it once was. When Lyndon B. Johnson enacted SNAP into law, he declared the program one of the nation’s “most valuable weapons for the war on poverty.” Sixty years, and a whole lot of stamps later, Johnson remains correct.
submitted by Grouchy-Serve-1203 to APUSH [link] [comments]


2024.04.16 15:32 Nestledrink Game Ready & Studio Driver 552.22 FAQ/Discussion

Game Ready & Studio Driver 552.22 has been released.

Article Here: https://www.nvidia.com/en-us/geforce/news/manor-lords-geforce-game-ready-drive
Game Ready Driver Download Link: Link Here
Studio Driver Download Link: Link Here
New feature and fixes in driver 552.22:
Game Ready - This new Game Ready Driver provides the best gaming experience for the latest new games supporting DLSS technology including Manor Lords which features support for DLSS Super Resolution. Further support for new titles includes the launch of No Rest for the Wicked.
Applications - The April NVIDIA Studio Driver provides optimal support for the latest new creative applications and updates including Blackmagic Design’s DaVinci Resolve version 19 and Trimble’s SketchUp 2024.
Fixed Gaming Bugs
Fixed Applications Issue
Fixed General Bugs
Open Issues
Additional Open Issues from GeForce Forums
Notes: This is not new. Manuel from Nvidia has been tracking any additional driver issues in their forum post separate from release notes. Started doing this recently and will continue moving forward
Driver Downloads and Tools
Driver Download Page: Nvidia Download Page
Latest Game Ready Driver: 552.22 WHQL
Latest Studio Driver: 552.22 WHQL
DDU Download: Source 1 or Source 2
DDU Guide: Guide Here
DDU/WagnardSoft Patreon: Link Here
Documentation: Game Ready Driver 552.22 Release Notes | Studio Driver 552.22 Release Notes
NVIDIA Driver Forum for Feedback: Driver 552.22 Forum Link
Submit driver feedback directly to NVIDIA: Link Here
RodroG's Driver Benchmark: TBD
NVIDIA Discord Driver Feedback: Invite Link Here
Having Issues with your driver? Read here!
Before you start - Make sure you Submit Feedback for your Nvidia Driver Issue
There is only one real way for any of these problems to get solved, and that’s if the Driver Team at Nvidia knows what those problems are. So in order for them to know what’s going on it would be good for any users who are having problems with the drivers to Submit Feedback to Nvidia. A guide to the information that is needed to submit feedback can be found here.
Additionally, if you see someone having the same issue you are having in this thread, reply and mention you are having the same issue. The more people that are affected by a particular bug, the higher the priority that bug will receive from NVIDIA!!
Common Troubleshooting Steps
If it still crashes, we have a few other troubleshooting steps but this is fairly involved and you should not do it if you do not feel comfortable. Proceed below at your own risk:
If you are still having issue at this point, visit GeForce Forum for support or contact your manufacturer for RMA.
Common Questions
Bear in mind that people who have no issues tend to not post on Reddit or forums. Unless there is significant coverage about specific driver issue, chances are they are fine. Try it yourself and you can always DDU and reinstall old driver if needed.
Remember, driver codes are extremely complex and there are billions of different possible configurations. The software will not be perfect and there will be issues for some people. For a more comprehensive list of open issues, please take a look at the Release Notes. Again, I encourage folks who installed the driver to post their experience here... good or bad.
Did you know NVIDIA has a Developer Program with 150+ free SDKs, state-of-the-art Deep Learning courses, certification, and access to expert help. Sound interesting? Learn more here.
submitted by Nestledrink to nvidia [link] [comments]


2024.04.14 22:46 Market_Mages Market Mages Weekly Newsletter: Week of April 7th - How Ongoing Geopolitical Unrest Affects Markets

Market Mages Weekly Newsletter: Week of April 7th - How Ongoing Geopolitical Unrest Affects Markets
While the economy, earnings, and interest rates are very important drivers of the markets, so are politics and geopolitics, especially when they reach the magnitude of where we are today.
Saturday, Oct. 7th of last year was when news started to break about the invasion by Hamas from Gaza into Southern Israel. This is one of those seminal events that will impact the world for many years. It is now over 7 months since this horrific event took place, and it is not getting any better; in fact, it is getting worse, and it will more likely get much worse.
It is almost certain that Israel will eventually go in and finish the job against Hamas and then turn its attention to Hezbollah in Southern Lebanon. While it would seem that Hamas and Hezbollah are no match for the IDF, it is Iran that stands behind them and uses them as proxies in its war against Israel.
Keep in mind also that Russia and China have formed alliances with Iran, and they are using Iran as a proxy in their cold war against America. That is why the market does not take these developments lightly: it is showing record gold prices and surging oil prices.
Both of these asset classes are good indicators of tension in the Middle East, and it is obvious from their recent price action that tensions are running very high at the current time. Oil is up over 28% since last December, and as a result, prices at the pump have gone up right along with rising oil prices.
https://preview.redd.it/3nbc6vvfbiuc1.png?width=975&format=png&auto=webp&s=7bd9dd78b2aee26bc487af18e85aef71eec568d1
This is also contributing to sticky inflation, which came in at a hotter-than-expected 3.5% annual rate (CPI) this past Wednesday. As a result, the DJIA was down 422 points on that day and is now threatening to roll over from a Distribution topping trend into a Markdown correction.
https://preview.redd.it/20ofki5ibiuc1.png?width=975&format=png&auto=webp&s=6daecd7d4d0da6372af3f7377ab9227182285cda
And while the market is drifting lower, gold is continuing higher. In fact, gold is hitting new, all-time highs after returning just an average of 5.5% per year over the last ten years (while the S&P 500 has averaged 17.8% per year).
https://preview.redd.it/uitidi1kbiuc1.png?width=975&format=png&auto=webp&s=aaafd85be156099e5c19eb435ad949189741f9aa
In the past, gold has done well during financial crises (it was up 24.9% during the financial crisis of 2007-2009 while the S&P was down 54.5%), during big drops in the U.S. dollar (we have not had any lately), and during central bank buying in times of world crisis. The last of these is what is now taking place.
The PPI numbers came in lighter than expected on Thursday and we had a nice rebound in the market with the AI stocks on fire.
But, then on Friday, we had another big sell-off in the market.

Major News Headlines For The Week:

Monday 4/8/24: A very dull day for the market on Monday. The DJIA is now 900 points below its recent high. I guess most traders were watching the eclipse.
The biggest S&P stocks have to do the heavy earnings lifting (again) - Goldman https://seekingalpha.com/news/4087763-the-10-biggest-sp-stocks-have-to-do-the-heavy-earnings-lifting-again
"The 10 largest S&P 500 (NYSEARCA:SPY) (IVV) (VOO) companies are expected to grow sales by 15% yeayear and post EPS growth of 32%," strategist David Kostin wrote in a note. "In contrast, the remaining 490 firms are expected to grow topline by just 2% yeayear and deliver EPS growth of -4%."
Jamie Dimon says rates could spike to 8%, AI akin to the printing press
https://seekingalpha.com/news/4087824-jamie-dimon-says-rates-could-spike-to-8-ai-akin-to-the-printing-press?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“JPMorgan Chase (NYSE:JPM) CEO Jamie Dimon says the markets should be prepared for more turbulence than what is currently priced in. In his annual letter to shareholders, Dimon cast doubt on the confidence that the U.S. economy will see a soft landing - and the corresponding equity valuation. "Equity values, by most measures, are at the high end of the valuation range, and credit spreads are extremely tight," Dimon wrote. "These markets seem to be pricing in at a 70% to 80% chance of a soft landing — modest growth along with declining inflation and interest rates."”
GE Aerospace boosts dividend by 250% to $0.28
https://seekingalpha.com/news/4087729-ge-aerospace-declares-missing-dividend
“GE Aerospace (NYSE:GE) declares $0.28/share quarterly dividend, 250% increase from prior dividend of $0.08”
Tuesday 4/9/24: Another dull day for the market as it awaits the big CPI report on Wednesday.
Germany's inflation rate drops to 2.2% in March, lowest since May 2021
https://seekingalpha.com/news/4088133-small-business-optimism-at-11-year-low-as-bulls-keep-running?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“Stock bulls are confident that AI enthusiasm will overcome fewer rate cuts and sticky inflation. But inflation continues to bite small business owners, who are the most pessimistic in more than 11 years. The NFIB March small business optimism index fell for the third-straight month to 88.5. That's the lowest level since December 2012 and the 27th month in a row it's been below the historical average of 98.”
Wells Fargo ups its year-end S&P 500 target to 5,535
https://seekingalpha.com/news/4087897-wells-fargo-ups-its-year-end-sp-target-to-5535?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“Wells Fargo said Monday that it has raised its year-end S&P 500 (SP500) target to 5,535, which would give the benchmark average upside of roughly 6.4% from current levels. “We raised our year-end SPX target to a Street-high 5535 (20.5x 2025E's $270) from 4625, implying 6.4% upside. The 20.5x multiple reflects a 4.88% discount rate (4% 10yr UST + 88bp IG credit spread). EPS focus shifts to 2025 from 2024,” Wells Fargo said in an investor note.”
Global oil market likely 'extremely tight' in this year's H2, Citadel says
https://seekingalpha.com/news/4088066-global-oil-market-likely-extremely-tight-in-this-years-h2-citadel-says?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“Crude oil futures fell Monday but settled above the session's lowest levels, as Israel said it would remove some troops from southern Gaza to prepare for future operations. Israel and Hamas opened a fresh round of Gaza ceasefire talks on Sunday, but media reports differed on the amount of progress that was made.”
Wednesday 4/10/24: A big sell-off in the market on Wednesday after a hotter-than-expected CPI report. The bond market gets clobbered as interest rates jump a whopping 19 basis points.
CPI inflation stays hot in March, rising 3.5% Y/Y
https://seekingalpha.com/news/4088527-cpi-inflation-stays-hot-in-march?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“The Consumer Price Index climbed 0.4% in March, the same increase as in February, and exceeding the 0.3% rise that economists expected. The indexes for shelter and gasoline both increased in March, contributing more than half of the monthly increase in the all-items index. The energy index jumped 1.1% over the month, and the food index edged up 0.1%. On a year-over-year basis, March's CPI climbed 3.5%, in line with expectations and accelerating from 3.2% in the prior month.”
Fitch cuts China's credit outlook to negative on public finance risks
https://seekingalpha.com/news/4088435-fitch-ratings-china-credit-outlook-negative-public-finance-risks?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“Fitch Ratings has cut its outlook on China's long-term foreign debt to negative from stable, reflecting the growing risks to the country's public finances, but affirmed its A+ rating. The rating agency pointed to China's uncertain economic prospects as it transitions away from property-reliant growth to what the government views as a more sustainable growth model.”
Crude oil forecasts raised at UBS on strong demand
https://seekingalpha.com/news/4088465-ubs-lifts-brent-wti-oil-price-forecasts-by-5bbl-raises-demand-growth-estimates?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“UBS on Wednesday raised its oil price forecasts by $5/bbl, estimating Brent (CO1:COM) to hold a range of $85–$95/bbl and West Texas Intermediate (CL1:COM) in $80–$90/bbl range over the coming months. The bank also raised its demand growth estimates for the year, by 0.1mbpd to 1.5mbpd, while cutting its OPEC+ crude production projection for the second quarter of 2024.”
Thursday 4/11/24: A huge rally for the AI stocks on Thursday. Nvidia had a big day.
Producer price index cools in March from hot February print
https://seekingalpha.com/news/4088883-producer-price-index-cools-in-march-from-hot-february-print?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“The Producer Price Index rose 0.2% in March, less than the +0.3% expected and slower than the +0.6% pace in February, the U.S. Department of Labor said on Thursday. The gauge increased 2.1% Y/Y, compared with +2.3% expected and +1.6% in February. Core PPI, which excludes food and energy, also increased 0.2% M/M vs. +0.2% expected and +0.3% prior. Compared with a year ago, core PPI rose 2.4%, topping the 2.3% expected and accelerating from the 2.0% pace in February.”
Initial jobless claims fall more than expected in past week
https://seekingalpha.com/news/4088882-initial-jobless-claims-fall-more-than-expected-in-past-week?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“Initial jobless claims for the week ended April 6 fell by 11K to 211K vs. 216K consensus and 222K prior (revised from 221K). The four-week moving average was 214,250, a decrease of 250 from the previous week's average of 214,500 (revised from 214,250). Continuing claims: 1.817M vs. 1.800M expected and 1.789M prior (revised from 1.791M).”
President Biden still expects rate cut before end of year
https://seekingalpha.com/news/4088697-president-biden-still-expects-rate-cut-before-end-of-year?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“President Joe Biden still sees the Federal Reserve cutting interest rates by the end of the year, he said on Wednesday, standing by his prediction laid out last month. Asked about March's hotter-than-expected consumer inflation report at a joint news conference with Japan's prime minister, Biden said the U.S. central bank might delay rate reductions by a month, and the White House is unsure about what exactly will happen.”
Friday 4/12/24: The market sells off on mid-east tension.
No link between weight loss drugs and suicidal thoughts, EU regulator concludes
https://seekingalpha.com/news/4089182-no-link-weight-loss-drugs-suicidal-thoughts?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“Following a months-long review, the EU drug regulator, the European Medicines Agency (EMA), announced Friday that there is no link between suicidal thoughts and weight loss drugs such as semaglutide marketed by Novo Nordisk (NVO).”
Exxon CEO's 2024 total pay bumped up 3% to $36.9M; Chevron CEO pay rose 12%
https://seekingalpha.com/news/4089117-exxon-ceos-2024-pay-bumped-up-3-to-369m-chevron-ceo-pay-rose-12?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“Exxon Mobil (NYSE:XOM) disclosed CEO Darren Woods' total compensation rose 2.8% to $36.9M last year, according to a new SEC filing Thursday. Woods' base salary was worth ~$1.9M, with the vast majority of his pay package based on variable factors such as a $4.8M bonus and stock-based awards. The CEO's total pay was 199x the median pay for an Exxon (XOM) worker, which improved 8% to $185,376.”
Drug shortages in U.S. at highest level on record since 2001
https://seekingalpha.com/news/4089131-drug-shortages-us-highest-level-on-record?source=content_type%3Areact%7Cfirst_level_url%3Amarket-news%7Csection_asset%3Amain
“Drug shortages in the U.S. continue to worsen, with 323 medications in short supply in Q1 2024, the highest on record since the American Society of Health-System Pharmacists and University of Utah Drug Information Service began tracking the data in 2001. "All drug classes are vulnerable to shortages," said ASHP CEO Paul Abramowitz. "Some of the most worrying shortages involve generic sterile injectable medications, including chemotherapy drugs and emergency medications."”
Stock Market Weekly Scorecard:

https://preview.redd.it/d6nnes6vbiuc1.png?width=975&format=png&auto=webp&s=cca17a6bb40abc799bd9b42b72f942b26f432aa1
submitted by Market_Mages to Market_Mages [link] [comments]


2024.04.14 21:33 atopix -14 LUFS IS QUIET: A primer on all things loudness

If you are relatively new to making music then you'll probably be familiar with this story.
You stumbled your way around mixing something that sounds more or less like music (not before having watched countless YouTube tutorials from which you learned many terrible rules of thumb). And at the end of this process you are left wondering: How loud should my music be in order to release it?
You want a number. WHAT'S THE NUMBER you cry at the sky in a Shakespearean pose while holding a human skull in your hand to accentuate the drama.
And I'm here to tell you that's the wrong question to ask, but by now you already looked up an answer to your question and you've been given a number: -14 LUFS.
You breathe a sigh of relief; you've been given a number in no uncertain terms. You know numbers: they are specific, there is no room for interpretation. Numbers are a warm, safe blanket you can curl up underneath.
Mixing is much more complex and hard than you thought it would be, so you want ALL the numbers, all the settings being told to you right now so that your misery can end. You just wanted to make a stupid song and instead it feels like you are now sitting at a NASA control center staring at countless knobs and buttons and graphs and numbers that make little sense to you, and you get the feeling that if you screw this up the whole thing is going to be ruined. The stakes are high, you need the freaking numbers.
Yet now you've submitted your -14 LUFS master to the streaming platforms, ready to bask in all the glory of your first release, and maybe you had loudness normalization disabled, or you gave it a listen on Spotify's web player, which has no support for loudness normalization. You are in shock: compared to all the other pop hits, your track is quiet AF. You panic.
You feel betrayed by the number; you thought the blanket was supposed to be safe. How could this be? Even Spotify themselves recommend mastering to -14 LUFSi.

The cold truth

Here is the cold truth: -14 LUFS is quiet. Most commercial releases of rock, pop, hip hop and EDM are louder than that, and they have been louder than that for over 20 years of digital audio, long before streaming platforms came into the picture.

The Examples

Let's start with some hand-picked examples from different eras, different genres, ordered by quietest to loudest.
LUFSi = LUFS integrated, meaning measured across the full length of the music, which is how streaming platforms measure the loudness of songs (a minimal measurement sketch follows the chart list below).
Billboard Year-End Charts Hot 100 Songs of 2023
  1. Last Night - Morgan Wallen = -8.2 LUFSi
  2. Flowers - Miley Cyrus = -7.2 LUFSi
  3. Kill Bill - SZA = -7.4 LUFSi
  4. Anti-Hero - Taylor Swift = -8.6 LUFSi
  5. Creepin' - Metro Boomin, The Weeknd & 21 Savage = -6.9 LUFSi
  6. Calm Down - Rema & Selena Gomez = -7.9 LUFSi
  7. Die For You - The Weeknd & Ariana Grande = -8.0 LUFSi
  8. Fast Car - Luke Combs = -8.6 LUFSi
  9. Snooze - SZA = -9.4 LUFSi
  10. I'm Good (Blue) - David Guetta & Bebe Rexha = -6.5 LUFSi
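For anyone who wants to reproduce these kinds of LUFSi readings on their own files, here is a minimal sketch in Python. It assumes the third-party packages `soundfile` and `pyloudnorm` are installed, and `my_master.wav` is a hypothetical file name; this is just one way to get a BS.1770-style integrated loudness number, not how any platform's own meter is implemented.

```python
# Minimal sketch: measure integrated loudness (LUFSi) of a local audio file.
# Assumes `soundfile` and `pyloudnorm` are installed; file name is hypothetical.
import soundfile as sf
import pyloudnorm as pyln

def integrated_lufs(path: str) -> float:
    data, rate = sf.read(path)               # decode audio to float samples
    meter = pyln.Meter(rate)                 # BS.1770 meter (K-weighting + gating)
    return meter.integrated_loudness(data)   # integrated loudness in LUFS

if __name__ == "__main__":
    # 'my_master.wav' is a placeholder; point this at any WAV/FLAC you have.
    print(f"{integrated_lufs('my_master.wav'):.1f} LUFSi")
```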

So are masters at -14 LUFSi or quieter BAD?

NO. There is nothing inherently good or bad about either quiet or loud; it all depends on what you are going for, how much you care about dynamics, what's generally expected of the kind of music you are working on, and whether that matters to you at all.
For example, by far most classical music sits below -14 LUFSi, because classical musicians and engineers care about dynamics more than anyone else. Classical music remains the best example of wide dynamics in recorded music: dynamics are baked into the composition and fully present in the performance as well.
Some examples:
Complete Mozart Trios (Trio of piano, violin and cello) Album • Daniel Barenboim, Kian Soltani & Michael Barenboim • 2019
Tracks range from -22.51 LUFSi to -17.22 LUFSi.
Beethoven: Symphony No. 9 in D Minor, Op. 125 "Choral" (Full symphony orchestra with sections of vocal soloists and choir) Album • Wiener Philharmoniker & Andris Nelsons • 2019
Tracks range from -28.74 LUFSi to -14.87 LUFSi.
Mozart: Symphonies Nos. 38-41 (Full symphony orchestra) Album • Scottish Chamber Orchestra & Sir Charles Mackerras • 2008
Tracks range from -22.22 LUFSi to -13.53 LUFSi.
On My New Piano (Solo piano) Album • Daniel Barenboim • 2016
Tracks range from -30.75 LUFSi to -19.66 LUFSi.

Loudness normalization is for THE LISTENER

Before loudness normalization was adopted, you would put together a playlist on your streaming platform (or prior to that on your iPod or computer with mp3s), and there would often be some variation in level from song to song, especially if you had some older songs mixed in with more modern ones; those jumps in level could be somewhat annoying.
Here comes loudness normalization. Taking a standard from European broadcasting, streaming platforms settled on the LUFS unit to normalize all tracks in a playlist by default, so that there are no big jumps in level from song to song. That's it! That's the entire reason why streaming platforms adopted LUFS and why LUFS are now a thing for music.
LUFS were invented in 2011, long after digital audio became a reality in the '80s. And again, they weren't made for music but for TV broadcasts (so that the people making commercials wouldn't crank up their levels to stand out).
And here we are now with people obsessing over the right LUFS just to publish a few songs.

There are NO penalties

One of the biggest culprits in the obsession with LUFS is a little website called "loudness penalty" (not even gonna link to it, that evil URL is banned from this sub), where you can upload a song and it will turn it down in the same way the different platforms would.
An innocent, good-natured idea by mastering engineer Ian Shepherd, which backfired completely by leading inexperienced people to panic about the supposed negative implications of incurring a "penalty" for having a master louder than -14 LUFSi.
Nothing bad happens to your loud master: the platforms DO NOT apply dynamic range reduction (i.e. compression). THEY DO NOT CHANGE YOUR SIGNAL.
The only thing they do is what we described above: they adjust playback volume (which, again, changes nothing in the signal) for the listener's convenience.
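To make concrete just how little the platforms do, here is a minimal sketch (my own illustration, not any platform's actual code) of gain-only normalization: a single scale factor derived from the measured LUFSi and the target, applied to the whole signal. The -8.0 LUFSi figure and the array of random samples are stand-ins.

```python
# Minimal sketch: gain-only "normalization" is one constant scale factor.
# No compression, no limiting; the waveform's dynamics are untouched.
import numpy as np

def normalization_gain(track_lufs: float, target_lufs: float = -14.0) -> float:
    gain_db = target_lufs - track_lufs       # e.g. -14 - (-8) = -6 dB of turn-down
    return 10 ** (gain_db / 20)              # convert dB to a linear scale factor

# Stand-in for a decoded track that measured a hypothetical -8.0 LUFSi:
samples = np.random.uniform(-1.0, 1.0, 48000)
turned_down = samples * normalization_gain(-8.0)  # same signal, just quieter
```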

Why does my mix sound QUIETER when normalized?

One very important aspect of this happens when comparing your amateur production to a professional production, level-matched: all the shortcomings of your mix are exposed. Not just the mix, but your production, your recording, your arrangement, your performance.
It all adds up to something that is perceived as standing out over your mix.
The second important aspect is that there can be a big difference between trying to achieve loudness at the end of your mix, vs maximizing the loudness of your mix from the ground up.
Integrated LUFS is a fairly accurate way to measure perceived loudness, as in perceived by humans. I don't know if you've noticed, but human hearing is far from being an objective sound level meter. Like all our senses (and the senses of all living things), it has evolved to maximize the chances of our survival, not to take scientific measurements.
LUFS are pretty good at getting close to how we humans perceive loudness, but they're not perfect. That means that two different tracks could be at the same integrated LUFS and one of them is perceived to be a bit louder than the other. Things like distortion, saturation and harmonic exciters, baked into a mix from the ground up, can help maximize a track for loudness (if that matters to you).

If it's all going to end up normalized to -14 LUFS eventually, shouldn't you just do it yourself?

If you've read everything here so far, you already know that LUFS are a relatively new thing, that digital audio in music has been around for much longer and that the music industry doesn't care at all about LUFS. And that absolutely nothing wrong happens to your mix when turned down due to loudness normalization.
That said, let's entertain this question, because it does come up.
The first incorrect assumption is that ALL streaming platforms normalize to -14 LUFSi. Apple Music, for instance, normalizes to -16 LUFSi. And of course, any platform could decide to change their normalization target at any time.
YouTube Music (both the apps and the music.youtube.com website) doesn't do loudness normalization at all.
The Spotify web player and third-party players don't do loudness normalization. So in all these places (plus any digital downloads, like on Bandcamp), your -14 LUFSi master of a modern genre would be comparatively much quieter than the rest.

SO, HOW LOUD THEN?

As loud or as quiet as you want! Some recommendations:
  1. Forget about LUFS, meters, and waveforms. It's completely normal for tracks in an album or EP to all measure different LUFS, and streaming platforms will respect the volume relationship between tracks when playing a full album/EP.
  2. Study professional references to hear how loud music similar to what you are mixing is.
  3. Learn to understand and judge loudness with nothing but your ears.
  4. Set a fixed monitoring level using a loud reference as the benchmark for the loudest you can tolerate; this includes all the gain stages that make up your monitoring chain's final level.
  5. If you are going to use a streaming platform, make sure to disable loudness normalization and set the volume to 100%.
The more time you spend listening to music with those fixed variables in place, the sooner digital audio loudness will just click for you without needing to look at numbers.

TLDR

The long-awaited (and much requested) wiki article is finally here: https://www.reddit.com/mixingmastering/wiki/-14-lufs-is-quiet
submitted by atopix to mixingmastering [link] [comments]


2024.04.12 20:14 JuicyLegend Operation ENF (OPENF) - Our best shot at finding the date, time and location of the recording of TMMS.

Operation ENF (OPENF) - Our best shot at finding the date, time and location of the recording of TMMS.
### Warning, long post ###
TL;DR: I'm initiating a project, Operation ENF (OpENF), aiming to use Electrical Network Frequency (ENF) analysis to uncover the exact recording date and location of TMMS. By extracting the ENF signature from the audio and comparing it with historical grid frequency data recovered from other media, I hope, together with you, to provide new insights into the song's origins. This initiative calls for expertise in signal processing and programming, as well as general community support to contribute computing power for data analysis.

Update: I have made a discord server to more easily discuss everything related to OpENF. Please keep in mind that this discord server is ONLY about OpENF related matters. You can join here.

Hello Everyone,
There have been a lot of great efforts already in trying to pinpoint the origin of TMMS. While we have some likely scenarios, there are (to my knowledge) no concrete numbers yet on the exact date(s), time(s) and location(s) of TMMS. u/SkiSTX actually made a post about the use of ENF to find the origin of TMMS last year. I want to build further upon this idea and I believe Operation ENF can change that. Operation ENF (OpENF) aims to discover the datetimes of TMMS through the ENF on the recordings. This operation is by no means an easy feat, but it is, in my opinion, one of our best bets to discover the date, time and perhaps even the location of the broadcast.

You might wonder now, what is an ENF?
This video made by Tom Scott will explain everything in detail.
Further explanation:
The power grid is an incredibly complex system that is, in essence, the medium for a constant tug of war between electricity consumers and producers. When there is a lot of demand for electricity, plants somewhere power up quickly to fill the demand and prevent a power outage. On the other hand, when there is a lot of supply, the grid operators need to keep the grid from overloading and start to dump electricity somewhere; in the best case this is a dam pumping water back up into a reservoir. The job of the grid operator is thus to constantly balance the grid to keep it from overloading or suffering an outage.
The grid runs at a nominal frequency of 50 Hz (or 60 Hz in the US) because it carries Alternating Current (AC) electricity. The "Alternating" part is exactly what the frequency describes: to put it in simple terms, the voltage swings "positive" half of the time and "negative" the other half.
Now, because supply and demand differ every second of the day, there are slight variations in the grid frequency caused by that tug of war. This wandering frequency trace is what we call the Electrical Network Frequency (ENF), and its pattern is effectively unique.
That means that for every second of every day, there is a unique frequency graph to be made, a sort of time capsule.
Why this is important is that all of the electrical devices we own receive that signal too. Hence, the tape recording of TMMS contains a frequency line at around 50 Hz (and at its higher harmonics) that carries a unique signature; it can reveal the time and date and, when compared to signals from multiple grids, even the ballpark of the transmission location. According to [1], it should be possible to retrieve the original broadcast ENF from the radio recording and hence pinpoint the datetimes and maybe the location.
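As a rough illustration of what "retrieving the ENF" can look like in practice, here is a minimal sketch, entirely my own assumption of one possible approach rather than an agreed OpENF method: band-pass the digitized audio around the nominal 50 Hz hum and track the dominant frequency in short overlapping windows. It assumes `numpy`, `scipy` and `soundfile` are available; real ENF work would use finer frequency estimation and the harmonics as well.

```python
# Minimal sketch (one possible approach, not the project's agreed method):
# isolate the ~50 Hz mains hum and track its frequency over time.
import numpy as np
import soundfile as sf
from scipy import signal

def rough_enf(path: str, nominal: float = 50.0, win_s: float = 8.0):
    x, fs = sf.read(path)
    if x.ndim > 1:
        x = x.mean(axis=1)                    # fold stereo to mono
    # Isolate the hum in a narrow band around the nominal mains frequency
    sos = signal.butter(4, [nominal - 1.0, nominal + 1.0],
                        btype="bandpass", fs=fs, output="sos")
    hum = signal.sosfiltfilt(sos, x)
    # Track the dominant frequency in overlapping windows (coarse: 1/win_s Hz bins)
    nper = int(win_s * fs)
    f, t, Z = signal.stft(hum, fs=fs, nperseg=nper, noverlap=nper // 2)
    band = (f > nominal - 1.0) & (f < nominal + 1.0)
    enf = f[band][np.abs(Z[band, :]).argmax(axis=0)]
    return t, enf                             # window times (s), estimated ENF (Hz)
```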

While that is very exciting, there are unfortunately some big problems.
  1. There is no ENF database from 1982-1984
  2. Because of the digitization process from the tapes, the ENF could be distorted

These are, for example, the ENFs I obtained from both TMMS recordings from u/bluuely.

The ENF signals from both TMMS recordings. BASF 4-1 TMS.wav (black) and TMS-new 32-bit PCM.wav (blue).
Luckily, there are some solutions to circumvent these problems and that is what I propose in the plan.
I want to propose a plan for an admittedly elaborate operation that requires a lot of teamwork and skilled people who know how to program and how to analyse signals.
The plan is as follows:

Phase 1:

Our initial step involves extracting the clearest possible ENF signal from TMMS recordings. Advanced signal processing techniques will be employed to enhance the ENF signal and correct any anomalies, laying the groundwork for accurate analysis.

Explanation Phase 1:
We need to obtain the best possible "mains hum" from both of the TMS recordings and possibly all tapes. There are ways to enhance the mains hum and also correct any drift that has occurred during the digitization of the analog tapes.
Once we have a good sample of the ENF and its harmonics, we first analyse whether it contains only 50 Hz (and its harmonics) or also 60 Hz (and its harmonics); a minimal version of this check is sketched after this explanation. This could already give a clue to the origin of the song, and in any case we will know definitively whether or not it was, for example, produced in the US.
I know I already showed the ENF above, but I might have unknowingly introduced a bias into the signal, so it is better to have confirmation from others.
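A minimal version of that 50 Hz vs 60 Hz check might look like the following sketch (again my own assumption, using `numpy`, `scipy` and `soundfile`): estimate the power spectrum and compare the energy around each nominal mains frequency and its first harmonics.

```python
# Minimal sketch: decide whether a recording carries 50 Hz or 60 Hz mains hum
# by comparing spectral energy around each nominal frequency and its harmonics.
import numpy as np
import soundfile as sf
from scipy import signal

def band_energy(f: np.ndarray, pxx: np.ndarray, center: float, width: float = 1.0) -> float:
    mask = (f > center - width) & (f < center + width)
    return float(pxx[mask].sum())

def mains_guess(path: str):
    x, fs = sf.read(path)
    if x.ndim > 1:
        x = x.mean(axis=1)                                 # fold stereo to mono
    f, pxx = signal.welch(x, fs=fs, nperseg=1 << 16)       # long windows = fine resolution
    e50 = sum(band_energy(f, pxx, h) for h in (50.0, 100.0, 150.0))
    e60 = sum(band_energy(f, pxx, h) for h in (60.0, 120.0, 180.0))
    return ("50 Hz grid" if e50 > e60 else "60 Hz grid"), e50, e60
```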

Phase 2:

With a refined ENF signal, we'll compare it against a database of historical ENF data, primarily focusing on audio recordings from Northwest Germany around September 1984. This database will serve as a benchmark to identify the recording's timeframe.

Explanation Phase 2:
Once we have the very best ENF we can get, we are going to compare it to a database. As far as I know, there is no database that contains ENF recordings from 1984, but luckily there are other ways to obtain the ENFs.
Our strategy will be to look for high-quality audio recordings from around that time. This could be anything from radio shows, music or TV shows, basically anything that was recorded locally or off a broadcast and of which (and this is very important) the time and date of recording are known! A possible idea is to source long-running German or Dutch TV shows as a supply of "good" audio. It makes the most sense to focus on audio that originates from NW Germany from around September 1984.
If we are able to generate a database of ENFs, we might be able to build a timeline of the recordings on Lydia's tapes, because basically every audio file on the tape contains a time capsule that has logged the time and date of its recording. If we are especially lucky, we might be able to find the exact date and perhaps the time that TMMS was broadcast. A minimal sketch of the matching step follows below.
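As promised above, here is a minimal sketch of the matching step, under my own assumptions: slide the ENF sequence extracted from the tape along a longer reference ENF series whose recording date is known, and score each offset by normalized correlation. Real matching would need drift correction and more robust scoring, but this is the core idea.

```python
# Minimal sketch: align a short tape ENF sequence against a longer, dated
# reference ENF series by sliding-window normalized correlation.
import numpy as np

def best_alignment(tape_enf: np.ndarray, ref_enf: np.ndarray):
    """Return the offset into ref_enf where tape_enf correlates best, and the score."""
    tape = tape_enf - tape_enf.mean()         # remove the ~50 Hz offset, keep the wiggles
    best_score, best_offset = -np.inf, None
    for k in range(len(ref_enf) - len(tape) + 1):
        seg = ref_enf[k:k + len(tape)]
        seg = seg - seg.mean()
        denom = np.linalg.norm(tape) * np.linalg.norm(seg)
        score = float(tape @ seg) / denom if denom else -np.inf
        if score > best_score:
            best_score, best_offset = score, k
    return best_offset, best_score            # best window index and its correlation
```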

Phase 3:

To further narrow down the location, we'll expand our database to include ENF data from various European regions, enabling us to verify the initial findings and potentially identify the broadcast's origin more precisely.

Explanation Phase 3:
Once a local database has been created, we then need to create a database of recordings from grids further away from the local area, preferably grids as far away as Spain, France, or further into Eastern Europe (depending on which grids were actually interconnected in 1984).
Depending on the knowledge we have gathered in Phases 1 and 2, we will adjust our game plan accordingly. If we build this database, we could ballpark the area the song was broadcast from, and there is a very small chance we could pinpoint the actual country of origin of TMS; a much more likely scenario is that the new information from Phase 2 gives us better directions on where to search.
Once all of these phases have been completed, hopefully successfully, they should give us the information we need on when and where TMS was broadcast.

Additional Insights:
  • There are actually possibilities to extract an ENF signature from video only, should it be the case that there are recordings with no audio, but I would have to look further into this.
  • There is also the DCF77 signal that has been broadcast from Mainhausen since 1959. There is a possibility that we come across this atomic clock time signal, which is emitted very precisely at 77.5 kHz. However, since recording bandwidths were quite narrow at the time, I don't think this is worth chasing after, but it could be something to keep in mind.

How to Participate:
  1. Signal Analysts: If you're skilled in signal processing, your expertise can greatly assist in extracting and analyzing ENF data.
  2. Programmers: We're developing scripts to automate the analysis process. Your coding skills can enhance these tools and streamline the operation.
  3. General Volunteers: Anyone can contribute by processing data using their computers, helping us sift through the information more efficiently.
What we need to make:
  1. ENF Algorithms: We need to make algorithms that can obtain the best ENF spectra from the tapes.
  2. Search Algorithms: We need algorithms that are able to automatically compare the tape recordings to the created database.
  3. Database Algorithms: We need to make code that is easy to run for users to automatically create and correctly label the data in the database.

I will post all of my data and code in this drive folder. This folder will also primarily serve as the main folder for making the database and all of the recordings etc. (until further notice)
Update: I have just made a repo on GitHub for this project and will update it regularly. Here is the link.
Together we can make this work! Thank you for reading
https://preview.redd.it/zz0ffh4q93uc1.jpg?width=1500&format=pjpg&auto=webp&s=73bb709dd96a8725bc4d9c5f9210b8a5bc83f1f2
If you want to have a further read, there are multiple papers about the topic of which these are the most useful:
Especially useful:
[1] Jenkins, C. (2011). An investigative approach to configuring forensic electric network frequency databases.
[2] Su H (2014) Temporal and spatial alignment of multimedia signals. PhD thesis, University of Maryland, College Park
[3] G. Hua, H. Liao, Q. Wang, H. Zhang and D. Ye, "Detection of Electric Network Frequency in Audio Recordings–From Theory to Practical Detectors," in IEEE Transactions on Information Forensics and Security, vol. 16, pp. 236-248, 2021
Useful:
[4] Garg R, Hajj-Ahmad A, Wu M (2013) Geo-location estimation from electrical network frequency signals. In: IEEE international conference on acoustics, speech and signal processing (ICASSP), pp 2862–2866
[5] Garg R, Varna AL, Hajj-Ahmad A, Wu M (2013) ‘Seeing’ ENF: Power-signature-based timestamp for digital multimedia via optical sensing and signal processing. IEEE Trans Inf Forensics Secur 8(9):1417–1432
[6] Garg R, Varna AL, Wu M (2011) ‘Seeing’ ENF: Natural time stamp for digital video via optical sensing and signal processing. In: ACM international conference on multimedia, MM ’11, New York, NY, USA. ACM, pp 23–32
[7] Garg R, Varna AL, Wu M (2012) Modeling and analysis of electric network frequency signal for timestamp verification. In: IEEE international workshop on information forensics and security (WIFS), pp 67–72
[8] Huijbregtse M, Geradts Z (2009) Using the ENF criterion for determining the time of recording of short digital audio recordings. In: Geradts ZJMH, Franke KY, Veenman CJ (eds) Computational Forensics, Lecture Notes in Computer Science, vol 5718. Springer Berlin Heidelberg, pp 116–124
[9] G. Hua, H. Liao, H. Zhang, D. Ye and J. Ma, "Robust ENF Estimation Based on Harmonic Enhancement and Maximum Weight Clique," in IEEE Transactions on Information Forensics and Security, vol. 16, pp. 3874-3887, 2021


submitted by JuicyLegend to TheMysteriousSong [link] [comments]

