It's just so tragic.
Such nonsense to suggest any new line could be built quicker than HS2. Any new route would need to go through the same consultation and design process HS2 did, i.e. years of delay. So it's shovel-ready HS2, or a new line with uncertain risks, costs, and time frames. That's the choice.
If you want to review HS2, fine: look at where there are issues you can resolve whilst building. But don't pretend there's some mythical option which avoids all the issues inherent in a project of this scale.
Took me a while to dig out my copy, but very much not. The next sections of the diary are:
It is. I liked the film but they reworked some elements of it. I don’t think it does the book justice.
“Enjoys” is not how I would describe it.
This is from The Prestige by Christopher Priest, in case anyone wonders. It's a good book!
I'm pretty sure it's real. I met someone once who worked in materials research for food, and they said that modelling was big there because the scope for experimentation is more limited. In materials for construction, if they wanted to change a property they could play around with adding new additives and see what happens. For food, though, you can't add anything beyond a limited set of chemicals that already have approval from the various agencies*, so they look at fine-tuning in other ways.
So for chocolate, for example, they control lots of material properties by very careful control of temperature and pressure as it solidifies. This is why, if chocolate melts and resolidifies, you see the white bits of milk that don't remain within the material.
*Okay, you can add a new chemical, but that means a time frame of over a decade to then get approval. I think the number of chemicals that has happened to is very, very small, and that's partly because the innovation framework of capitalism is very short-term.
Yes, I agree that it's silly for the headline and article to reference memes; it undermines the study as a whole, which seems more sound.
I know loads of people who take hundreds of photos a day and then pay a cloud hoster (or use a "free" service) to store them indefinitely, never to look at them again.
Cloud storage isn't straightforwardly just hard-drive storage, because it's kept in data centers such that it can be downloaded at any point.
Cloud storage is replacing any sense of needing digital-archivist processes for people and businesses, because it's much cheaper and easier to store everything just in case the data is needed again than to actually think strategically about what data is important to keep and what isn't.
Though it's worth saying that the link suggests the computing was used for aerodynamics, to ensure production wouldn't destroy them, not for the shape as such. I've also seen it said that the can is part of that too.
It is quite hard to track down, but here it is being reported by the head of modelling at P&G in 2006:
https://www.hpcwire.com/2006/05/05/high_performance_potato_chips/
Very much so. We aren't winning until the taps are turned off.
I'm sure it's small. "AI" is an unnecessary waste of resources when we can ill afford it. That said, we have actual quantifiable targets for energy and emissions (targets that are so tough because we've left it so late), so it might still be the case that this also needs to change.
Sadly, one of the things I hear quite a lot from people is the assumption that digital means no impact at all, and they act according to that assumption; but when you add it up, it is having a sizeable impact.
This is a consistent misunderstanding, and one I wish people would see past.
Manufacturing things creates emissions. It costs energy and materials. Something could have absolutely no emissions in usage and still be problematic when done at growing scales, because the manufacture costs energy, emissions, and resources. Hard drives wear out and die and need replacing. Researchers know how to account for this: it's a life cycle assessment calculation. These aren't perfect, but this is robust work.
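To make that concrete, here's a minimal sketch of the kind of sum a life cycle assessment does for stored data. Every number in it is a placeholder I've made up for illustration, not a real LCA figure:

```python
# Toy life cycle assessment for keeping 1 TB in "the cloud" for 20 years.
# All numbers are illustrative placeholders, not real LCA figures.

EMBODIED_KGCO2_PER_DRIVE = 30.0    # manufacturing footprint of one drive (assumed)
DRIVE_LIFETIME_YEARS = 5.0         # drives wear out and get replaced (assumed)
OPERATIONAL_KGCO2_PER_YEAR = 10.0  # electricity + cooling per drive-year (assumed)
REDUNDANCY = 2                     # cloud data is usually replicated (assumed)

def lifetime_emissions(years: float) -> float:
    """Total kgCO2e to keep the data available for `years`."""
    drives_built = years / DRIVE_LIFETIME_YEARS          # replacements over the period
    embodied = drives_built * EMBODIED_KGCO2_PER_DRIVE   # manufacturing emissions
    operational = years * OPERATIONAL_KGCO2_PER_YEAR     # running emissions
    return REDUNDANCY * (embodied + operational)

print(lifetime_emissions(20))  # usage can be "clean" and the total still adds up
```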
IT is up to 4% of global emissions and the sector is growing. People consistently act as if digital media has no footprint, and it does. https://www.sciencedirect.com/science/article/pii/S2666389921001884
Yes, the headline is a little silly, but we actually do need to think strategically about the sector, and that starts by realising it has an impact and asking ourselves what priorities we want to save whilst we decarbonise the industry that supports it.
There's no wiggle room left; no sector or set of behaviours can afford to be given slack. We are in the biggest race of our lives and the stakes are incomprehensibly huge.
Triple the cost and start building mixed-use developments on them.
I think you misunderstand what a whole-life CO2 assessment is. It factors in carbon per unit of longevity. Often you will also be assessing other factors, like cost per unit of CO2, too.
Rail is predominantly an upfront CO2 cost in infrastructure, in exchange for much lower operational CO2 costs, and as such these questions are quite important if your job is the decarbonisation of rail.
We need decarbonisation across all sectors, so minimising the lifetime CO2 of infrastructure (even public transport infrastructure) is absolutely a priority.
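As a rough sketch of what that kind of assessment trades off (all the figures here are invented for the example, not real rail numbers):

```python
# Toy whole-life CO2 comparison: big embodied (upfront) cost vs. operational savings.
# Every figure is an assumption for illustration, not real rail data.

def whole_life_co2(embodied_kt: float, operational_kt_per_year: float,
                   design_life_years: int) -> float:
    """Total ktCO2e over the asset's design life."""
    return embodied_kt + operational_kt_per_year * design_life_years

LIFE = 120  # hypothetical design life in years

# Hypothetical new rail line: heavy construction, light running emissions.
rail = whole_life_co2(embodied_kt=5000, operational_kt_per_year=20, design_life_years=LIFE)

# Hypothetical do-nothing case: the same journeys made by road instead.
road = whole_life_co2(embodied_kt=0, operational_kt_per_year=80, design_life_years=LIFE)

print(rail, road)                # upfront CO2 repaid by decades of lower running CO2
print(rail / LIFE, road / LIFE)  # the "carbon per year of longevity" view
```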
Discourage people using the train during a climate crisis.
I do think HS2 will just end up being finished. No other way.
Largely limited by government decision, if I remember. Hoping the new government makes it substantially easier with the flick of a pen…
The answers to your questions are:

(1) Yes, it's a different baseline to the one chosen by the Paris Agreement; different baselines are chosen as relevant to different elements of the issue. The baseline in your link is likely down to what reliable data they have, so they chose a baseline from a period they have data for rather than going to other sources. This website provides the latest year's official record in Paris terms; I would expect the next one (2024) to be much closer to 1.5°C.

(2) I agree that current measurements suggest an instantaneous/yearly temperature of around 1.5°C against the relevant baseline.

(3) You are right that the trend is unlikely to change, because it comes from radiative forcing (emissions) that has already occurred, so even with sudden zero human emissions we would see an increase or, at best, a levelling (before, perhaps, a long-term decline as CO2 is naturally removed from the atmosphere, or faster if humans find a way of doing so at scale). A trend, however, is already an average over several time points, and you can see in the link you sent that the year-on-year variation in that number can be as high as ~0.3°C. This comes about from non-GHG forcing elements of the system (such as El Niño) that add natural variation. So already you could see the yearly figure drop by 0.2°C from 2019 onwards, even though the trend is up. You could therefore expect us to potentially drop back down to, say, 1.2°C for a few years before it goes up again. The link above suggests that, on the best data we have, we would likely breach 1.5°C by 2031, so not long at all.
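To see why one hot year isn't the same as the decadal number being breached, here's a toy series: an upward trend plus El Niño-ish noise. The trend rate and noise size are assumptions for illustration, not fitted values:

```python
import random

random.seed(0)

# Toy anomaly series: a steady warming trend plus natural year-to-year variation.
# The 0.02 °C/yr trend and ±0.15 °C noise are illustrative assumptions.
anomaly = {y: 1.0 + 0.02 * (y - 2000) + random.uniform(-0.15, 0.15)
           for y in range(2000, 2031)}

def decadal_mean(end_year: int) -> float:
    """Average of the 10 years ending at end_year: the Paris-relevant number."""
    return sum(anomaly[y] for y in range(end_year - 9, end_year + 1)) / 10

for y in (2020, 2025, 2030):
    print(y, round(anomaly[y], 2), round(decadal_mean(y), 2))
# Individual years can spike past a threshold while the decadal mean lags behind it.
```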
This sounds like a pedantic point, but it's actually quite important for the climate, and the confusion stems back to how the problem and the climate science were chosen to be communicated. Temperature was chosen in part because it's a proxy variable for other parts of the system that actually control the impacts, and it was felt that temperature would be "naturally understandable" by the general population (and politicians…). This backfired a bit, because 1.5°C is not a lot of difference when considered in, say, a room, and that highlights why this variable is different and why it matters that it's a decadal average rather than a yearly one.

So if temperature is only a proxy, what are the variables that control the outputs? One key one is the total heat energy stored in the different Earth systems, and there the size of the storage medium matters (the reason 1.5°C on the world is a lot, but on a room isn't, is that given the sheer volume of the Earth you need a huge amount more energy).

The other place where surface temperature adds confusion and complexity is the oceans: the oceans have been absorbing some of the heat, and that hasn't always been visible to us (as we don't live in the ocean), so if we stopped emitting today the ocean may then deposit some of that heat energy back into the atmosphere. It's a complex interaction. What we really need to know is the additional level of radiative forcing and how much additional heat energy is swimming about in Earth's systems; that is what will control the experience we have of the climate. Greenhouse gases act to stop Earth cooling back down by radiating out to space, which is why the effect is cumulative. So the difference between a sustained year-on-year 1.5°C and something that averages less but has a few years at 1.5°C is quite high, because they result in different amounts of total energy in the system.
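For a back-of-envelope feel for the storage-size point: warming a room by 1.5°C versus (as a thought experiment) warming the whole ocean by 1.5°C. The room is an arbitrary assumption; the ocean mass and specific heat are round textbook-ish values:

```python
# Energy to warm something by dT: Q = m * c * dT

DT = 1.5  # °C

# A ~50 m^3 room of air (arbitrary example room).
room_mass = 50 * 1.2   # kg, using air density ~1.2 kg/m^3
room_c = 1005          # J/(kg*K), specific heat of air
q_room = room_mass * room_c * DT

# The whole ocean, as a thought experiment (round figures).
ocean_mass = 1.4e21    # kg of seawater
ocean_c = 3990         # J/(kg*K), specific heat of seawater
q_ocean = ocean_mass * ocean_c * DT

print(f"room:  {q_room:.1e} J")   # ~9e4 J; a kettle does this in seconds
print(f"ocean: {q_ocean:.1e} J")  # ~8e24 J; twenty orders of magnitude more
```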
So, the short answer is that the Paris Agreement targets are set on the basis of what a decadal rise of 1.5°C by 2100 (i.e. the average over 2090-2100) means in terms of the excess heat energy and radiative forcing in the system. The limit itself is somewhat arbitrary, driven in part by the fact that we were at ~1°C when it was agreed and 2°C seemed like a reasonable estimate of something we might be able to limit it to. The origin of 1.5°C rather than 2°C is actually quite interesting and highlights a lot about how climate change policy has been decided, but this post is long enough.
This is a good point. The sheer apocalyptic magnitude of the problem means that every tiny amount of change matters. Billions will die. There probably isn't a way to prevent that completely anymore. But if we can tick things down by a fraction, we might save a few hundred thousand people, or preserve a species of food crop that would have gone extinct. IDK what the exact outcomes are, but the point is that tiny changes will have a massive impact, and they're important even if the situation is dire.
Agreed, I think this is the right way of thinking about it. Having communicated it to the world as a binary target of 1.5°C/2°C, we risk people completely switching off if/when we finally confirm we've breached it, when in reality that should embolden us further, not demoralise us. This is my number one concern at the moment. I would also add that what we are doing is "pushing" a system away from its natural equilibrium, and if we push hard enough we might find changes in the system itself which are very hard or impossible to undo. So it's more than just "more increase, more damage"; it's also about the risk of fundamentally and permanently changing the system.
As an analogy, think of a ball in the well of a local minimum, being pushed back and forth. If we hit it hard enough, rather than coming back it goes and finds another minimum, which is just a whole different system than the one we are used to. These are sometimes called tipping points, and the frustrating thing about the complexity of the systems is that we don't and can't know for sure where those points are (although we do know the risks increase heavily as you move above 1.5°C). They are by definition hard to model, because models are built up from prior experience (data) and these are in part unprecedented changes in the atmospheric record.
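Here's a toy version of that picture, nothing to do with real climate numbers: an overdamped ball in the double-well potential V(x) = (x^2 - 1)^2, given a steadily held push. Below a critical push it stays in its well; past it, it tips into the other one:

```python
# Overdamped ball in a double-well potential V(x) = (x^2 - 1)^2.
# Wells sit at x = -1 and x = +1; a constant "push" tilts the landscape.
# Purely a toy model of a tipping point, not a climate simulation.

def settle(push: float, x: float = -1.0, dt: float = 0.01, steps: int = 20000) -> float:
    """Follow x' = -V'(x) + push and return where the ball ends up."""
    for _ in range(steps):
        grad = 4 * x * (x * x - 1)  # V'(x)
        x += dt * (-grad + push)
    return x

for push in (0.0, 0.8, 1.5, 1.6):
    print(f"push={push:.1f} -> settles at x={settle(push):+.2f}")
# For this potential the threshold is ~1.54: below it the ball stays near x = -1,
# above it the ball jumps to the other well and stays there.
```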
I haven't mentioned "negative emissions" technologies, but it is worth saying that in principle we could be able to do significant negative emissions, and that might mean we end up at 1.5°C in 2100 whilst having had a period of time above it. Negative emissions technologies could be a whole other rant, though. Worth noting that lots of the pathways that show we could just about keep to 1.5°C do rely on negative emissions to different degrees (though the pathways are also limited in how far they think we might be able to push our economic systems).
My understanding is that this will require new design and consultation stages for Phase 2. We have already spent about £2 billion on Phase 2, which is likely not recoverable, and you would need to respend at least a significant fraction of that on new design, consultation, lawyers, etc. So any cost savings you expect from different design requirements would need to be much greater than that (£2 billion is probably around 3-4% of total cost).
Yes, slower services allow more flexibility with alignments, but that comes at the cost of larger fleet sizes (the same timetable needs more trains if each journey takes longer) and likely more warehousing requirements (unless you reduce the passenger capacity to correspond). Speed was looked at in the original plans, and it was found that reducing the speed somewhat did not reduce overall costs that much but did reduce the outcomes quite a lot.
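The fleet-size point is basically timetable arithmetic. A sketch, with the route length, speeds, and turnaround times invented for illustration:

```python
# Trains needed ≈ trains per hour × round-trip time (including turnarounds).
# The route, speeds, and turnaround below are invented for illustration.
import math

def fleet_size(route_km: float, avg_speed_kmh: float,
               trains_per_hour: float, turnaround_h: float = 0.5) -> int:
    round_trip_h = 2 * route_km / avg_speed_kmh + 2 * turnaround_h
    return math.ceil(trains_per_hour * round_trip_h)

# Same 500 km route, same 10 trains/hour service, fast line vs slower line:
print(fleet_size(500, avg_speed_kmh=300, trains_per_hour=10))  # 44 trains
print(fleet_size(500, avg_speed_kmh=200, trains_per_hour=10))  # 60 trains
```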
The biggest problem is the way costs have been amalgamated and communicated. HS2 lumped in some really major project works that needed to happen anyway (notably rebuilding Euston, which is currently not fit for purpose for current passenger numbers), alongside at least two new stations to facilitate interconnections with the rest of the network. In other countries those would come under separate budget lines and not look like one project.
The other big cost factor for HS2 was that we simply demanded more from it. We required it to be incredibly good at avoiding as much ecological disruption as possible, and that meant more expensive tunnelling. Partly as a result, it would have been the only climate-resilient line in the country. So, as another commenter said, if an alternative is cheaper (and I would stake money it won't be significantly cheaper), it will be at the cost of much less care towards the environment and a much less future-proof outcome. If we wish to meet our climate obligations we need massive increases in rail usage, and that only begins to be possible if you free up this scale of capacity on the rest of the line.
The other thing to say is that the cost is a bit of a fiction in itself. The cost is paid for by borrowing against future revenues of the service, so if you downgrade the service to save 1% of the cost, you potentially downgrade the return even more, which means you could actually cost the Treasury more. This isn't money that is available for anything else, despite how it's been reported.
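A crude sketch of the point, with completely made-up numbers (nothing here is real HS2 data): if the scheme is financed against future revenue, a "saving" that cuts revenue by more than it cuts cost leaves the Treasury worse off:

```python
# Toy comparison of "capital cost saved" vs "revenue lost" on a downgraded scheme.
# All numbers are invented for illustration; none of this is real HS2 data.

def net_position(capital_cost_bn: float, annual_revenue_bn: float,
                 years: int = 60) -> float:
    """Lifetime revenue minus upfront cost (discounting ignored for simplicity)."""
    return annual_revenue_bn * years - capital_cost_bn

full_scheme = net_position(capital_cost_bn=100, annual_revenue_bn=2.0)
# "Save" 1% of the capital cost, but the downgraded service earns, say, 5% less:
downgraded = net_position(capital_cost_bn=99, annual_revenue_bn=1.9)

print(full_scheme, downgraded)  # 20.0 vs 15.0: the "saving" costs the Treasury more
```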