As Greg Maddux, Tom Glavine and Frank Thomas celebrated their inductions in Cooperstown this weekend, the Baseball Hall of Fame announced a change that will make it harder for others to join them. Instead of having 15 years of eligibility for consideration by the Baseball Writers’ Association of America (BBWAA), players will now be limited to 10. (A player becomes eligible five years after retirement; if he doesn’t receive at least 5 percent of the votes in his first year, he’s excluded from future ballots.)

One theory is that the change is designed to exclude players like Barry Bonds and Roger Clemens, who are known or suspected to have used performance-enhancing drugs. (Retired players such as Alan Trammell who have already appeared on at least 10 ballots will be exempt from the rule, but Bonds and Clemens, who joined the ballot in 2013, won’t be.)

But an attempt to target Bonds and Clemens could produce collateral damage. Players such as Curt Schilling, Edgar Martinez, Mike Mussina and Larry Walker — who are not strongly associated with PED use — could also be less likely to get in.

Take the case of Mussina, who received 20 percent of the vote on this year’s ballot, his first year of eligibility. He might seem like a hopeless case — players need 75 percent of the vote to be elected to the Hall of Fame. But players generally gain ground the longer they remain on the ballot. Sometimes they need the full 15 years to get there.

Consider other players who received somewhere between 15 and 25 percent of the vote in their first eligible season.
There were 16 such players between 1966, when the Hall of Fame began holding elections every year instead of every other one, and 2000, the most recent class of players to have exhausted their 15-year eligibility window.

Two of these players, Don Drysdale and Billy Williams, gained ground quickly enough to be elected to the Hall of Fame within their first 10 eligible seasons. Another three — Bruce Sutter, Bert Blyleven and Duke Snider — were elected by the BBWAA at some point between their 11th and 15th eligible seasons. One player, Red Schoendienst, was elected later by the Veterans Committee. The 10 remaining players — Gil Hodges, Jack Morris, Roger Maris, Tommy John, Mickey Lolich, Jim Kaat, Dale Murphy, Dave Parker, Thurman Munson and Tony Oliva — have not yet made the Hall of Fame, though some are plausible candidates for election by the Veterans Committee at a later date.

So by a quick-and-dirty rendering, Mussina’s chances of getting elected to the Hall of Fame by the BBWAA have been sliced from 5 in 16 (representing the five players who made it within 15 seasons) to 2 in 16 (only Drysdale and Williams made it within their first 10 seasons). He might also have some chance with the Veterans Committee. But the Veterans Committee has been stingy about electing players in recent years. The point is that players like Mussina need all the chances they can get.

We can formalize this analysis by running a set of logistic regressions that estimate a player’s likelihood of eventually making the Hall of Fame based on his performance in his first year on the BBWAA ballot.

First, I ran a regression to consider whether players were selected by the BBWAA within 15 seasons. (As in the Mussina example, this regression considered all players who first appeared on the ballot between 1966 and 2000. I excluded players who were elected in their first year, or who received less than 5 percent of the vote in the first year, as these players have been automatically dropped from the ballot since 1985.)

Then I ran another regression to evaluate whether players made it within their first 10 eligible seasons. (Among players who first appeared on the ballot in 1966 or later, those who were elected by the BBWAA somewhere between their 11th and 15th seasons were Snider, Sutter, Blyleven and Jim Rice. For this regression, I included players who first appeared on the ballot from 2001 through 2005, in addition to those between 1966 and 2000, since they’ve had 10 years to be elected.)

Finally, I considered whether players made the Hall of Fame at all — whether through the BBWAA or the Veterans Committee. (In this case, I included all players who first appeared on the ballot from 1966 through 1995; players who began appearing on the ballot after 1995 have not yet been eligible for consideration by the Veterans Committee, as best I can tell. For this regression only, I also included players who received less than 5 percent of the vote in their first year on the ballot, as a few of these players, Richie Ashburn, Larry Doby and Ron Santo, were eventually elected by the Veterans Committee.)

The results are represented in the chart below. To read the chart, scan across until you find a player’s vote share in his first year of eligibility, then scan up to see where the various curves intersect it. For instance, for a player like Mussina, who got 20 percent of the vote in his first year:

There is a 10 percent chance he gets elected within his first 10 years of BBWAA eligibility, according to the regression analysis. (This is the yellow curve.)

There is a 23 percent chance he gets elected within the 15-year eligibility window. (The red curve.)

There is a 34 percent chance he gets elected by either the BBWAA or eventually by the Veterans Committee. (The blue curve.)

These answers aren’t too far from the quick-and-dirty numbers that I came up with before. They suggest that Mussina is an underdog to make the Hall of Fame — and more of an underdog now that he’ll have only 10 years of eligibility to do so.

What about a player — such as Bonds — who got 36 percent of the vote in his first season of eligibility?

He’d have a 53 percent chance of being elected by the BBWAA within 10 years.

His odds of being elected within 15 years are higher — 69 percent.

He has an 89 percent chance of being elected by some means — either the BBWAA or the Veterans Committee.

So a player like this will also see his chances of being elected by the BBWAA decrease with the rule change. But he has a much better backstop: The Veterans Committee has usually elected players like this even when they were bypassed by the writers. That hasn’t been true for players like Mussina.

Of course, Bonds and Clemens are no ordinary cases, and this method may not do a very good job of describing their chances. There are a couple of other objections that we need to consider first, however.

One is that the change in rules could affect voter behavior. Players sometimes receive a boost in their vote share in their 15th and final year of eligibility. Now, knowing that it’s their last chance, the writers could rally around a player in his 10th year instead. That might protect a few players — Snider, for instance, got 71 percent of the vote in his 10th year of eligibility and might have made it then if a few more writers thought it was their last opportunity to elect him. But Blyleven had only 48 percent of the vote in his 10th year. His case, which was pushed by stat-savvy baseball fans for years, needed some extra time to marinate.

Another consideration is that rotating players off the ballot sooner could clear slots for more recently retired players. BBWAA voters are limited to naming 10 players on their ballots.
A few of them might have run out of room for Mussina this year, for instance, because they were reserving space for Alan Trammell, Jack Morris or other players between their 11th and 15th years of eligibility. Indeed, this could be of some help to players like Mussina. But there would be a more direct means of providing relief: liberalizing or eliminating the 10-player limit. Players from the 1980s, 1990s and 2000s are badly underrepresented in the Hall of Fame relative to players who had the good fortune to be born earlier.

The rule change, in other words, seems designed to make the Hall of Fame more exclusive, not less so. But how might it affect Bonds and Clemens in particular?

As I mentioned, they aren’t ordinary cases. For a player like Mussina, a large fraction of the BBWAA electorate might be thought of as “swing voters” — they could live with him in the Hall of Fame or without. Given how strong feelings are on the issue of performance-enhancing drugs, the choice is likely to be much more binary for Bonds and Clemens. For that reason, their vote shares might not increase as much in future seasons. (Another PED user, Mark McGwire, has been on the ballot for eight seasons and has seen his vote share decrease in almost every one.) Personally, I’d wager a fair amount of money against Bonds or Clemens ever being elected to the Hall of Fame by the writers, whether in 10 years or 15.

Nevertheless, baseball’s hive mind could change its stance on PED use with the benefit of hindsight. It’s not that hard to conceive of alternate realities. NFL players who were suspended for PED use, like the former San Diego Chargers linebacker Shawne Merriman, barely seem to suffer any lasting damage to their reputations.
(Merriman made the Pro Bowl in 2006, the same year he was suspended for four games.)

One scenario could involve a known PED user who is otherwise a more sympathetic case than Bonds or Clemens making the Hall of Fame. (Or a player who is already in the Hall of Fame could disclose his PED use.) For instance, Andy Pettitte, who admitted to using human growth hormone, is due to become eligible for the Hall of Fame in 2019. Pettitte’s case is not clear-cut on the statistical merits, but suppose he made it in 2023, his fifth year on the ballot. Under the old rules, Bonds and Clemens would have had a few years left on the ballot with that precedent in place. Now, they’ll already have exhausted their eligibility.

Bonds and Clemens would still be eligible for consideration by the Veterans Committee. But whatever misgivings you might have about the BBWAA, the Veterans Committee has been far more problematic. Its rules are constantly changing, its process is not very transparent, and it has oscillated from being far too liberal to being very stingy about letting in players. Depending on the rules it drew up, the Hall of Fame could design a Veterans Committee that was relatively sympathetic to Bonds or Clemens — or firmly opposed to their election.

Another theory is that the Hall of Fame doesn’t have strong feelings about Bonds and Clemens per se, but implemented the rule change in the hopes of putting the PED issue behind it sooner. It’s certainly not good advertising for Cooperstown when discussions are dominated every year by arguments over steroids.

But these cases won’t go away anytime soon. Pettitte will become eligible in a few years — and a few years after him, Alex Rodriguez. Ryan Braun, another known PED user who could eventually build Hall of Fame statistics, is many years from retirement. In the meantime, players like Mussina could be caught in the crossfire.
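The shape of the regression curves described above can be sketched with a few lines of code. A logistic curve is a straight line on the log-odds scale, so two points are enough to pin one down; the sketch below reverse-engineers an illustrative version of the red curve (election within 15 years) from the two numbers quoted in the analysis — 20 percent first-year share implying a 23 percent chance, and 36 percent implying 69 percent. The resulting coefficients are a reconstruction for illustration, not the actual fitted model.

```python
import math

def logit(p):
    # Log-odds: the scale on which a logistic curve is a straight line.
    return math.log(p / (1 - p))

def logistic(x):
    return 1 / (1 + math.exp(-x))

# Two points quoted for the red curve (elected within 15 years):
# a 20 percent first-year share -> 23 percent chance; 36 percent -> 69 percent.
x1, p1 = 0.20, 0.23
x2, p2 = 0.36, 0.69

# Two points determine a line in log-odds space, and hence the whole curve.
slope = (logit(p2) - logit(p1)) / (x2 - x1)
intercept = logit(p1) - slope * x1

def chance_within_15(first_year_share):
    # Estimated probability of BBWAA election within the 15-year window.
    return logistic(intercept + slope * first_year_share)

print(round(chance_within_15(0.20), 2))  # 0.23, by construction
print(round(chance_within_15(0.36), 2))  # 0.69, by construction
```

The yellow (10-year) and blue (any-route) curves could be reconstructed the same way from their own quoted points; the steepness of the line in log-odds space is what makes small gains in first-year vote share translate into large gains in election probability.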
In their first Big Ten game of the season, the Ohio State Buckeyes answered the call. OSU (5-0, 1-0) took down Wisconsin (3-2, 1-1), 31-24, thanks in part to a huge game from quarterback Braxton Miller.

In his first action since spraining the MCL in his left knee against San Diego State, the junior helped the Buckeyes get off to a fast start Saturday night. Following a punt by the Badgers on the opening drive of the game, Miller connected with junior wide receiver Evan Spencer on a 25-yard scoring strike that gave OSU a 7-0 lead.

“I think he (Miller) played very well,” OSU coach Urban Meyer said. “Braxton did have a heck of a day.”

Wisconsin responded, though, as redshirt-senior wide receiver Jared Abbrederis caught a 36-yard touchdown pass from Badger redshirt-sophomore quarterback Joel Stave.

Miller completed his second touchdown pass of the night on OSU’s next drive, this one to junior wide receiver Devin Smith. The 26-yard reception was Smith’s fifth scoring grab of 2013.

Following a 45-yard field goal from OSU senior kicker Drew Basil, Stave led the Badgers on a 76-yard scoring drive, finishing it with an 11-yard pass to junior tight end Sam Arneson to cut the lead to 17-14 with less than two minutes left before halftime.

The OSU offense rushed down the field on the next drive, and Miller threw his third touchdown pass of the first half to senior wide receiver Corey “Philly” Brown, giving the Buckeyes a 24-14 lead at halftime.

“I felt good,” Miller said after the game. “My legs felt good energy-wise. I wasn’t out of shape. I felt good.”

After another Miller-Brown connection extended the lead to 31-14, Wisconsin senior running back James White scampered 17 yards to get the Badgers within 10 points again. The two teams exchanged punts until Badger redshirt-junior kicker Kyle French made a 42-yard field goal. OSU recovered the ensuing onside kick but was forced to punt after a three-and-out.
Wisconsin’s last-ditch effort fell short, though, and the Buckeyes ran out the clock to secure the seven-point victory.

Miller finished the day 17-of-25 passing for 198 yards and four touchdowns. He also ran the ball 22 times for 83 yards. Abbrederis had a huge game for the Badgers, finishing with 10 catches for 207 yards and a touchdown.

The game was not a total victory for OSU, as a senior safety broke his left ankle on Wisconsin’s last drive of the game.

The Buckeyes travel to Northwestern (4-0) next Saturday for their second night game in a row. Kickoff is set for 8 p.m.
Mobile courts can operate two weeks more

The Appellate Division of the Supreme Court has further extended by two weeks the stay on a High Court verdict that declared the operation of mobile courts by executive magistrates illegal. Attorney General Mahbubey Alam said the executive magistrates can operate mobile courts during this time.

A six-member bench of the Appellate Division, led by Chief Justice Surendra Kumar Sinha, adjourned the hearing for two more weeks following a plea by the government. On 4 July, the Appellate Division had granted the government two weeks for preparation. When the matter was placed before the court for hearing, Hasan MS Azim stood for the writ petitioner. Earlier, the court had stayed the hearing till 2 July.

Following a stay plea by the state against the HC verdict, the chamber judge stayed the HC verdict till 18 May and sent it to the regular bench for hearing. The matter was placed before the court for hearing on 21 May.
MADRID — “La Casa de Papel” (“Money Heist”) Part 3 wowed at its world premiere in Madrid Thursday night, boosted by the presence of creator Alex Pina, a 23-strong cast headed by the Professor’s Alvaro Morte and Tokyo’s Ursula Corberó, and Netflix chief content officer Ted Sarandos.

Just Episode One was screened, and its contents remain under wraps for the press until the global bow on Netflix on July 19. If the reaction of the Madrid audience was anything to go by, however — though it might be called a little biased, including multiple members of the cast and Pina’s production house crew, who emerged as one of the stars of the proceedings — Ep. 1 is one 50-minute adrenaline fix. Several moments drew applause and cheers from a knowing audience, well versed in Parts One and Two, which spilled out onto central Callao Square in Madrid energized by the episode.

Sarandos’ presence alone speaks volumes about the importance of the show for the streaming platform, not only in terms of global audience but in what it says about Netflix’s gameplay: to drive up international subscriptions not only through the appeal of foreign-language shows in their own local markets but also far beyond, to many of Netflix’s now 148 million subscribers around the world.
“For generations one country and one culture have been exported around the world, but what ‘La Casa de Papel’ proves is that great stories can come from anywhere in the world and make the whole world happy,” said Sarandos.

Introduced to the audience by Francisco Ramos, Netflix VP of original series and producer of “Elite,” another big Netflix hit, Sarandos added: “The show is extraordinary. It is from Spain but viewed all over the world. From Istanbul to Paris, from Rio to Los Angeles, fans love this show, they cannot wait, they can’t get enough of these infamous men, and they are waiting for the return.”

Thanking Pina and his producing partner Cristina López, Sarandos announced that “we really have the best international superstar cast to make this show possible” — an indirect reference to how the fame of the series’ actors, none real stars outside Spain before “La Casa de Papel,” has spiked hugely with the series, which was declared by Netflix in April of last year to be its most-watched non-English series ever.

Pina also thanked Diego Avalos, previously in Los Angeles but named head of fiction in Spain this February, for a night of conversation “about phones and socks” before describing the “La Casa de Papel” gang as a band which “robs, feels and makes people laugh.” Pina’s take on his series is that it is an eminently Latin action thriller whose characters wear their emotions far more on their sleeves than more buttoned-up U.S. personages.

Giving thanks above all to the crew of “La Casa de Papel” — whose work on Part 3 looks destined to be one of the talking points of the new season — Alvaro Morte delivered a message in English to the show’s fans. “Thank you for being there watching the show, supporting us, sending us your love that we feel in our stomach,” he said, adding: “We are here tonight just for you.
We are a band, we are a family, now you are part of that band, we are the resistance. And now you are the resistance too.”

Talking about resistance to the system, Morte looks to be teasing one of the appeals of Part 3, where the band’s representation of the world’s disaffected — social write-offs, misfits, the marginalized — may become all the more clear.

In a festive atmosphere, the presentations were met with cheers by a hugely enthusiastic audience. The after-premiere party looked set to go on far into the night.
Summer heat is becoming more unbearable every year, but the one sweet recollection we associate with the season is mangoes. The pulpy, sweet and flavourful king of fruits keeps us happy irrespective of the killer heat waves. To relish a selection of innovative mango-inspired dishes, one can head to The Spice Route at The Imperial in the national capital. The collection menu created by Chef Veena Arora is all set to raise the bar for mango lovers this season. Summer is the time to drool over mangoes, and the excitement becomes extraordinary when they rule the South East Asian palate.

The various dishes made using this most sought-after fruit have been relished for over two decades through Chef Veena’s Summer Collection Menu. This year’s menu features the chef’s selections and creations inspired by the regions of Kerala, Sri Lanka, Thailand and Vietnam. Under the umbrella cuisine of The Spice Route, the menu promises to steal hearts with traditional specialties put together with ripe or raw mangoes, complemented by the chef’s specials. For instance, Amba Isso Temperadu is Sri Lankan-style prawns stir-fried with curry powder and slivers of raw mango. Alleppey Fish Curry is made with sole fillet and raw mango in a Kerala-style curry. Kaeng Phed Phol-La-Mai, made with ripe mangoes and assorted fruits cooked in Thai red curry, is truly exclusive, as it is made only with fruits and takes its inspiration from Thai duck curry.

Chef Veena has delicately balanced spices with mango in all her offerings. Old favourites have been retained in the menu, like Ga Xao Hot Dieu, a stir-fried chicken with fresh mangoes and cashew nuts made in the Vietnamese style.
Yum Mamuang is the chef’s special, a signature salad of fresh grated green mango tossed in a spicy and tangy Thai dressing.

Chef Veena Arora, chef de cuisine at The Spice Route, says, “The Summer Collection menu has been close to my heart since the time The Spice Route opened its doors, and invokes timelessness for me each time I start planning it. Inspired by the fashion fraternity, the menu is whipped up to dish out something unique for the patrons from the landmark kitchen of this world-famous restaurant. The interesting raw and ripe mango-based South-East Asian recipes are refreshing, like the advent of summer ought to be, and a celebration of the season, of course.

“Most of them are my own creations, wonderfully complemented by rice, chicken, prawns, sole fish, veggies and exotic spices. People in Thailand love their food with fruits, and that’s why the complete menu revolves around mango, the favourite summer fruit.

“Whether it is the Sri Lankan prawns or the Thai red curry with fruits, my creations this year will truly raise the satisfaction quotient for mango seekers. I have tried to keep the fruit central to the palate, layering and balancing it with other flavours.”
Opinions expressed by Entrepreneur contributors are their own.

By now, most organizations have a firm grasp on how to use video to market their brand, promote their products and connect with customers. But how can businesses leverage the power of video internally?

A report from Forrester Research shows that 27 percent of firms planned to launch internally focused enterprise video in 2013, up from 21 percent in 2010. What’s more, 75 percent of respondents to an enterprise video survey conducted by my company, Kaltura, said they felt that the integration of video into a company’s tools (email, social business, instant messaging, etc.) would play an important role in the near future.

Here are seven ways to begin using video internally to improve productivity, collaboration and communication:

1. Employee onboarding and training

About 64 percent of respondents to the Kaltura survey are currently using video for training and onboarding, and for good reason. Using videos during the onboarding process and to train employees on new tasks can help improve knowledge and engagement while reducing costs. About 80 percent of respondents said that using video could make the onboarding process for new employees simpler, while 87 percent stated that using video helps train employees faster and more cheaply.

2. Recruiting new talent

Using video to recruit new talent can make the process easier and faster for the company and job-seekers alike. Video interviews help avoid the back-and-forth of scheduling, as well as travel costs. Video integration can also be used as an attractive selling point for new talent. According to a survey conducted by Cisco, 87 percent of young professionals tracked to become executives said that a company’s investment in video would influence their decision when considering otherwise equal job offers.

3. Internal communications

A majority of respondents to the Kaltura survey said that video could have a positive impact on internal communications. Using video to make announcements, promote company initiatives and handle other regular interactions improves relationships between employees and executives and adds personality to stiff organizations, according to 80 percent of the survey participants. As the boundaries of business shift and industries become more global, video can serve as a tool to break down communication barriers with international team members. Among aspiring executives surveyed by Cisco, 94 percent said video can help overcome language barriers.

4. Video conferencing

Video conferencing and online meetings can connect team members working from different locations more effectively than the traditional audio conference call. This is very valuable in today’s workplace, where 92 percent of millennials want to work remotely and 87 percent want to work on their own clock, per a study by oDesk. According to 82 percent of respondents to the Kaltura survey, video improves collaboration and productivity among colleagues separated by location. To that end, 76 percent of respondents agreed that video provides a close second to in-person communication, much more so than written communications. The vast majority of respondents also emphasized the value of recording live videos for subsequent viewing on demand so that, for example, team members won’t miss important information while traveling.

5. Knowledge sharing

Participants in the survey said they felt that videos were most valuable for improving knowledge sharing (95 percent of respondents!). Through video, employees can effectively share best practices and how-to tutorials.
Sharing employee-generated content not only improves learning, but also boosts creativity, empowers and engages employees, and fosters stronger relationships among team members.

6. Event coverage

Use live video streaming of events to bring together teams working from different locations and to build an integrated company culture. Creating an internal video news portal will also help to keep everyone in the loop and feeling connected with executives and the organization as a whole.

7. Video social network

All of the aforementioned video experiences could and should be offered within existing online environments such as corporate portals, blogs, wikis, content-management systems, learning-management systems and social-business platforms. However, 70 percent of the participants in the Kaltura survey also saw value in having a standalone video portal, which would be home for all of the live and on-demand video content that would trickle there from all other environments. Such a portal for contributing, sharing and consuming video encourages social interaction and networking, and is, not surprisingly, often referred to as a “CorporateTube.”
Internet penetration has increased almost seven-fold, from 6.5 percent to 43 percent of the global population, between 2000 and 2015, according to the International Telecommunication Union (ITU). The ITU’s latest global research claims that by the end of 2015 there will be 3.2 billion people using the internet, 2 billion of whom will be from developing countries.

However, four billion people in the developing world will remain offline, with 851 million of the almost one billion people living in the least developed countries not using the internet.

The report claims that by the end of the year there will be more than 7 billion mobile cellular subscriptions globally, corresponding to a penetration rate of 97 percent, up from 738 million subscriptions in 2000. “Mobile broadband is the most dynamic market segment; globally, mobile broadband penetration reaches 47% in 2015, a value that increased 12 times since 2007,” said the ITU.

The ITU is an agency for information and communication technology issues, and claims to be the “focal point” for governments and the private sector in developing networks and services.
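A penetration rate is simply the number of users or subscriptions divided by population, so the figures above can be sanity-checked with back-of-envelope arithmetic. The world-population number below is an assumed round figure for 2015, not taken from the report:

```python
# Back-of-envelope check of the ITU figures. The world-population
# number is an assumed round figure for 2015, not from the report.
world_pop = 7.3e9
internet_users = 3.2e9   # reported internet users at end of 2015
mobile_subs = 7.1e9      # reported "more than 7 billion" subscriptions

internet_penetration = internet_users / world_pop
mobile_penetration = mobile_subs / world_pop
growth = 0.43 / 0.065    # 2015 internet share over the 2000 share

print(f"{internet_penetration:.0%}")  # 44%, close to the reported 43%
print(f"{mobile_penetration:.0%}")    # 97%, matching the reported rate
print(round(growth, 1))               # 6.6, i.e. "almost seven-fold"
```

The small gap on the internet figure comes from the assumed population; the mobile rate exceeding most adults' needs (97 of every 100 people) reflects individuals holding multiple subscriptions, which is why subscription penetration can approach or exceed 100 percent.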
Last month, a group of Australian scientists published a warning to the citizens of the country and of the world, who collectively gobble up some $34 billion annually of its agricultural exports. The warning concerned the safety of a new type of wheat.

As Australia’s number-one export, a $6-billion-a-year industry, and the most-consumed grain locally, wheat is of the utmost importance to the country. A serious safety risk from wheat — a mad wheat disease of sorts — would have disastrous effects for the country and for its customers. Which is why the alarm bells are being rung over a new variety of wheat being ushered toward production by the Commonwealth Scientific and Industrial Research Organisation (CSIRO) of Australia.

In a sense, the crop is little different from the wide variety of modern genetically modified foods. A sequence of the plant’s genes has been turned off to change the wheat’s natural behavior a bit, to make it more commercially viable (hardier, higher yielding, slower decaying, etc.).

Franken-Wheat?

What’s really different this time — and what has Professor Jack Heinemann of the University of Canterbury, NZ, and Associate Professor Judy Carman, a biochemist at Flinders University in Australia, holding press conferences to garner attention to the subject — is the technique employed to effectuate the genetic change. It doesn’t modify the genes of the wheat plants in question; instead, a specialized gene blocker interferes with the natural action of the genes.

The process at issue, dubbed RNA interference, or RNAi for short, has been a hotbed of research activity ever since the Nobel Prize-winning 1998 research paper that described it. It is one of a number of so-called “antisense” technologies that help suppress natural genetic expression and provide a mechanism for suppressing undesirable genetic behaviors. RNAi’s appeal is simple: it can potentially provide a temporary, reversible off switch for genes.
Unlike most other genetic modification techniques, it doesn’t require making permanent changes to the underlying genome of the target. Instead, specialized siRNAs — short RNA molecules based on the same mechanism our own bodies use to temporarily turn genes on and off as needed — are delivered into the target organism and act to block the messages cells use to express a particular gene. When those messages meet with their chemical opposites, they turn inert. And when all of the siRNA is used up, the effect wears off.

The new wheat is in early-stage field trials (i.e., it’s been planted to grow somewhere, but has not yet been tested for human consumption), part of a multi-year process on its way to potential approval, not unlike the rigorous process many drugs go through. The researchers responsible are using RNAi to turn down the wheat’s starch production: they are targeting the wheat branching enzyme which, if suppressed, would result in a much lower starch level for the wheat. The result would be a grain with a lower glycemic index — i.e., healthier wheat. This is a noble goal.

However, Professors Heinemann and Carman warn, there’s a risk that the gene silencing done to these plants might make its way into humans and wreak havoc on our bodies. In their press conference and subsequent papers, they describe the possibility that the siRNA molecules — which are pretty hardy little chemicals and not easily gotten rid of — could wind up interacting with our RNA. If their theories prove true, the results might be as bad as mimicking glycogen storage disease IV, a super-rare genetic disorder which almost always leads to early childhood death.

“Franken-Wheat Causes Massive Deaths from Liver Failure!” Now that is potentially headline-grabbing stuff. Unfortunately, much of it is mere speculation at this point, albeit rooted in scientific expertise on the subject.
What they’ve produced is a series of opinion papers — not scientific research nor empirical data to prove that what they suspect might happen actually does. They point to the possibilities that could arise if a number of criteria are met: if the siRNAs remain in the wheat in transferrable form, in large quantities, when the grain makes it to your plate; and if the siRNA molecules interfere with the somewhat different but largely similar human branching enzyme as well. Then the result might be symptoms similar to such a condition, on some scale or another, anywhere from completely unnoticeable to highly impactful. They further postulate that if the same effect is seen in animals, it could result in devastating ecological impact. Dead bugs and dead wild animals.

Luckily for us, as potential consumers of these foods, all of these are easily testable theories. And this is precisely the type of data the lengthy approval process is meant to look at. Opinion papers like this — while not to be confused with conclusions resulting from solid research — are a critically important part of the scientific process, challenging researchers to provide hard data on areas that other experts suspect could be overlooked. Professors Carman and Heinemann provide a very important public good in challenging the strength of the due-diligence process for RNAi’s use in agriculture, an incomplete subject we continue to discover more about every day.

However, we’ll have to wait until the data come back on this particular experiment — among thousands of similar ones being conducted at government labs, universities, and in the research facilities of commercial agribusinesses like Monsanto and Cargill — to know if this wheat variety would in fact result in a dietary apocalypse. That’s a notion many anti-genetically-modified-organism (GMO) pundits seem to have latched onto following the press conference the professors held.
But if the history of modern agriculture can teach us anything, it’s that far more aggressive forms of genetic modification appear to have had a huge net positive effect on the global economy and our lives. Not only have GMO foods not killed us; in many ways they have been responsible for massive improvements in public health and quality of life around the world.

The Roots of the GMO Food Debate

The debate over genetically modified (GM) food is a heated one. Few contest that we are working in somewhat murky waters when it comes to genetically modifying anything, human or plant alike. At issue, really, is the question of whether we are prepared to use the technologies we’ve discovered. In other words, are we the equivalent of a herd of monkeys armed with bazookas, unable to comprehend the sheer destructive power we possess yet perfectly capable of pulling the trigger? Or do we simply face the same type of daunting intellectual challenge as those who discovered fire, electricity, or even penicillin, at a time when the tools to fully understand how they worked had not yet been conceived of?

In all of those cases, we were able to probe, study, and learn the mysteries of these incredible discoveries over time. Sure, there were costly mistakes along the way. But we were also able to make great use of them to advance civilization long before we fully understood how they worked at a scientific level. Much is the same in the study and practical use of GM foods. While the fundamentals of DNA have been well understood for decades, we are still uncovering many of the inner workings of what is arguably the single most advanced form of programming humans have ever encountered. It is still very much a rapidly evolving science to this day. 
For example, in the 1990s an idea known simply as “gene therapy” – really a generalized term for a host of new-at-the-time experimental techniques sharing the simple characteristic of permanently modifying the genetic make-up of an organism – was all the rage in medical study. Two decades on, it’s hardly ever spoken of. That’s because the great majority of attempted disease therapies based on genetic modification failed, with many resulting in terrible side effects and even death for the patients who underwent the treatments. Its use in the early days, of course, was limited almost exclusively to some of the world’s most debilitating, genetically rooted diseases. Still – whether in their zeal to use a fledgling tool to cure a dreadful malady or in a selfish, hurried desire to be recognized among the pioneers of what they thought would be the very future of medicine – doctors chose to move forward at a dangerous pace with gene therapy.

In one famous case, somewhat typical of the times, University of Pennsylvania physicians enrolled a sick 18-year-old boy with a liver mutation in a trial for a gene therapy that was known to have resulted in the deaths of some of the monkeys it had just been tested on. The treatment resulted in the young man’s death a few days later, and the lengthy investigation that followed produced serious accusations of what can only be called “cowboy medicine.” Not one of science’s prouder moments, to be sure.

But could GM foods be following the same dangerous path? After all, the first GM foods made their way to market during the same time period. The 1980s saw large-scale genetic-science research and experimentation from agricultural companies, producing everything from antibiotic-resistant tobacco to pesticide-hardy corn. 
After much debate and study, in 1994 the FDA gave approval to the first GM food to be sold in the United States: the ironically named Flavr Savr tomato, with delayed-ripening genes that made it an ideal candidate for sitting for days or weeks on grocery store shelves. Ever since, there has been a seeming rush of modified foods into the marketplace. Modern GM foods include soybeans, corn, cotton, canola, sugar beets, and a number of squash and greens varieties, as well as products made from them. One of the most prevalent modifications is to make plants glyphosate-resistant, or in common terms, “Roundup Ready.” This yields varieties that are able to stand up to much heavier doses of the herbicide Roundup, which is used to keep weeds and other pest plants from damaging large monoculture fields, thereby reducing costs and lowering risks.

In total, modern GM crops are estimated to have grown into a $12 billion annual business since their commercialization in 1994, according to the International Service for the Acquisition of Agri-biotech Applications (ISAAA). Over 15 million farms around the world are reported to have grown GM crops, and their popularity increases every year. They’ve brought huge improvements in shelf life, pathogen and other stress resistance, and even added nutritional benefits. For instance, golden rice – the first approved crop with an entirely new genetic pathway added artificially – is designed to provide beta-carotene to a large population of people around the world who otherwise struggle to find enough in their diets. However, the rush of horticulturalists to the genetic table in the past few decades – what could accurately be described as the transgenic generation of research – has by no means been our first experiment with the genetic manipulation of food. In fact, if anything, it is a more deliberate, well-studied, and careful advance than those that came before it. 
A VERY Brief History of Genetically Modified Food

Some proponents of GMO foods are quick to point out that humans have been modifying foods at the genetic level since the dawn of agriculture itself. We crossbreed plants with each other to produce hybrids (can I interest you in a boysenberry?). And of course, we select our crops for breeding from those with the most desirable traits, effectively encouraging genetic mutations that would otherwise have resulted in natural failure if not helped along by human hands. Corn as we know it, for example, would never have survived in nature without our help in breeding it. Using that as a justification for genetic meddling, however, is like saying NASCAR drivers don’t need seatbelts because kids have been building soapbox racers without them for years. Nature, had the mix not been near ideal to begin with, would have prevented such crossbreeding. Despite Hollywood’s desires, one can’t simply crossbreed a human and a fly – or even a bee and a mosquito, for that matter – their genetics are too different to mix naturally. And even if such a cross did somehow occur, if it did not make for a hardier result, natural selection would have quickly kicked in.

No, I am talking about real, scientific genetic mucking – the kind we imagined would result in the destruction of the world by giant killer tomatoes or man-eating cockroaches in our B-grade science-fiction films. Radiation mutants. Enterprising agrarians have been blasting plants with radiation of all sorts ever since we started messing around with atomic science at the dawn of the 20th century. In the 1920s, just when Einstein and Fermi were getting into their grooves, Dr. Lewis Stadler at the University of Missouri was busy blasting barley seeds with X-rays – research that would usher in a frenzy of mutation breeding to follow. With the advent of nuclear technology from the war effort, X-rays gave way to atomic radiation, with gamma rays leading the pack. 
The United States even actively encouraged the practice for decades, through a program dubbed “Atoms for Peace” that proliferated nuclear technology throughout various parts of the private sector in the hope that it would improve the lives of many. And it did. Today, thousands of agricultural varieties we take for granted – including, according to a 2007 New York Times feature on the practice, “rice, wheat, barley, pears, peas, cotton, peppermint, sunflowers, peanuts, grapefruit, sesame, bananas, cassava and sorghum” – are a direct result of mutation breeding. They would not be classified as GM foods, in the sense that we did not use modern transgenic techniques to make them, but they are genetically altered nonetheless, to the same or a greater degree than most modern GMO strains. Unlike modern GM foods – which are often closely protected by patents and armies of lawyers to ensure the inventing companies reap maximum profits from their use – the overwhelming majority of the original generations of radiation-mutated plant varieties came out of academic and government-sponsored research, and thus were provided free and clear for farmers to use without restriction.

With the chemical revolution of the mid-20th century, radiation-based mutation was followed by the use of chemical agents like the methyl sulfate family of mutagens. And after that came the crudest forms of organic genetic manipulation, such as the use of transposons – mobile, often highly repetitive stretches of DNA, discovered in 1948, that can be used like biological duct tape to patch whole sections of the genome. These modified crops stood up better to pests, lessened famines, reduced reliance on pesticides, and most of all enabled farmers to increase their effective yields. Coupled with the development of commercial machinery like tractors and harvesters, the rise of mutagenic breeding resulted in an agricultural revolution of a magnitude few truly appreciate. 
In the late 1800s, the overwhelming majority of the world’s population lived in rural areas, and most people spent their lives in agrarian pursuits. From subsistence farmers to small commercial operations, the majority of the population of every country, the US included, was employed in agriculture. Today, less than 2% of the American population (legal and illegal combined) works in farming of any kind. Yet we have more than enough food to feed all of our people, and a surplus to export to more densely populated nations like China and India.

The result is that a sizable percentage of the world’s plant crops today – the ones on top of which much of the modern era’s GMO experimentation is done – are already genetic mutants. Hence the slippery slope that underlies regulators’ resistance to labeling GM food products. Where do you draw the line on what to label? And frankly, after the Wild West days of blasting everything that could grow with some form of radiation or another, how do you even know for sure which plants are truly virgin DNA? The world’s public is largely unaware that many of the foods they eat today – far more than those targeted by anti-GMO protestors and labeling advocates – are genetically modified. Yet we don’t seem to be dying off in large numbers, as the anti-RNAi researchers project will happen. In fact, global lifespans have increased dramatically across the board in the last century.

The Rise of Careful

The science of GM food has advanced considerably since the dark ages of the 1920s. Previous versions of mutation breeding were akin to trying to fix a pair of eyeglasses with a sledgehammer – messy and imprecise, with rare positive results. And the outputs of those experiments were often foisted upon a public without any knowledge or understanding of what it was consuming. Modern-day GM foods are produced with a much more precise toolset, which means less unintended collateral damage. 
Of course, that toolset also opens up a veritable Pandora’s box of new possibilities (glow-in-the-dark corn, anyone?) and with it a whole host of potential new risks. Like any sufficiently powerful technology – including the radiation and harsh chemicals used in prior generations of mutation breeding – it can produce devastating results without careful control over its use. That risk is outweighed only by the massive improvements over the prior, messier generation of techniques. And thus, regulatory regimes from the FDA to CSIRO to the European Food Safety Authority (EFSA) are taking increasing steps to ensure that GM foods are thoroughly tested long before they come to market. In many ways the tests are more rigorous than those prescription drugs undergo, as the target population is not sick and in need of urgent care, for which side effects can be tolerated. This is why a great many of the proposed GM foods of the last 20 years – including the controversial “suicide seeds” meant to protect the intellectual property of large GM seed producers like Monsanto (which bought out Calgene, the inventor of that Flavr Savr tomato, and is now the 800-lb. gorilla of the GM food business) – were never allowed onto the market.

Still, with the 15 years from 1996 to 2011 seeing a 96-fold increase in the amount of land dedicated to growing GM crops, and the incalculable success of the generations of pre-transgenic mutants before them, scientists and corporations remain in a mad sprint to find the next billion-dollar GM blockbuster. In doing so they are seeking tools that make the discovery of such breakthroughs faster and more reliable. With RNAi, they may just have found one such tool. If it holds true to its laboratory promise, its benefits will be obvious from all sides. Unlike previous generations of GMO, RNAi-treated crops do not need to be permanently modified. 
This means that modifications which outlive their usefulness – like resistance to a plague that has since been eradicated – do not need to live on forever. That allows companies to be more responsive, and it potentially provides a big relief to consumers concerned about the implications of eating foods with permanent genetic modifications. The simple science of creating RNAi molecules is also attractive to the people who develop these new agricultural products: once a messenger RNA is identified, there is a precise formula to tell you exactly how to shut it off, potentially saving millions or even billions of dollars that would otherwise be spent in the research lab figuring out how to affect a particular genetic process.

And with the temporary nature of the technique, both farmers and the Monsantos of the world can breathe easier over the huge intellectual-property questions of how to deal with genetically altered seeds – not to mention the questions of strains spreading naturally to farms that might not want GMO crops in their midst. Instead of needing to engineer in complex genetic functions to ensure progeny don’t pass down enhancements for free and that black markets in GMO seeds don’t flourish, the economic equation becomes as simple as fertilizer: use it or don’t. While RNAi is not a panacea for GMO scientists – it serves as an off switch, but cannot add new traits or even turn on dormant ones – the dawn of antisense techniques is likely to mean an even further acceleration of the science of genetic meddling in agriculture. Its tools are more precise than even many of the most recent permanent genetic-modification methods. And the temporary nature of the technique – the ability to apply it selectively as needed, versus breeding it directly into plants that may not benefit from the change decades on – is sure to please farmers, and maybe even consumers as well. 
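That “precise formula” is, at its core, mechanical enough to sketch in code. Under the simplifying assumption that any 21-nucleotide window of the target message is a candidate (real design pipelines add filters for GC content, off-target matches, and strand asymmetry, all omitted here), enumerating guide strands looks something like this – the transcript fragment below is invented for illustration:

```python
# Sketch of naive siRNA guide-strand enumeration: slide a fixed-length
# window along the messenger RNA and take the reverse complement of
# each window as a candidate guide strand.
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def candidate_guides(mrna: str, length: int = 21) -> list[str]:
    """Return one candidate guide strand per window of the mRNA."""
    guides = []
    for i in range(len(mrna) - length + 1):
        window = mrna[i:i + length]
        # The guide is the reverse complement of the target window.
        guides.append("".join(COMPLEMENT[b] for b in reversed(window)))
    return guides

# A made-up fragment of a hypothetical branching-enzyme transcript:
mrna = "AUGGCAAGCUUCGGAUCCGGAACUUAGCAUGC"
guides = candidate_guides(mrna)
print(len(guides))  # one candidate per 21-nt window: 12 for this 32-nt fragment
```

The point is not that any of these candidates would work in a plant, but that the search space is small and fully determined by the target sequence – which is exactly why the approach is so much cheaper than trial-and-error permanent modification.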
That is, unless the professors are proven correct, and the siRNAs used in experiments today make their way into humans and affect the same genetic functions in us as they do in the plants. The science behind their assertions still needs a great deal of testing. Much of it runs counter to the basic understanding of how siRNA molecules are delivered – an incredibly difficult and delicate process that has been the subject of hundreds of millions of dollars of research thus far and still remains, thanks to our formidable immune systems, a daunting challenge facing one of the most promising forms of medicine (and now of farming too). Still, their perspective is important food for thought… and likely fuel for much more debate to come. After all, even if you must label your products as containing GMO-derived ingredients, does that apply if you just treated an otherwise normal plant with a temporary, consumable genetic off switch? In theory, the plant that ends up on your plate is once again genetically no different from the one that would have been there had no siRNAs been used during its formative stages.

One thing is sure: the GMO food train left the station nearly a century ago and is now a very big business that will continue to grow and to innovate, using RNAi and other techniques to come. The Casey Extraordinary Technology team has been tracking the leading lights of the RNAi medical industry for some time. Recently, one of our small biotech upstarts struck a potentially massive, exclusive deal with an agricultural giant to seed its own RNAi research program. Success could mean billions for both firms. If you’d like to know which company we believe will profit most from the next generation of GM food development, subscribe to CET.

Bits & Bytes

Last Chance for RIM?
(CNN Money) Few companies have been written off as frequently as Research in Motion, whose BlackBerry was once state of the art and which now finds itself fighting for its life. Its stock just soared 9% merely because the company said the release of the new BlackBerry 10 is still on schedule for early next year. Whether the 10 will be able to put a dent in the Apple/Android monolith remains to be seen, but for RIM it could be the last, best hope.

Giant Media Merger
(LA Times) What do you get when you mate Han Solo with Minnie Mouse? We’re about to find out – fiscally, if not physically – with Tuesday’s announcement that Disney is acquiring Lucasfilm for a cool $4 billion. Disney is projecting it’ll get its money back within three years, while George is, well, retiring – as he is now well able to do.

Google Settles Final AdWords Dispute
(Ars Technica) Several companies have taken Google to court over AdWords, saying Google shouldn’t be allowed to key advertisements to their names, which are protected trademarks. The last and one of the most persistent has been Rosetta Stone, a language-software maker that sued Google in 2009 but lost in federal court. However, its case was revived on appeal, and yesterday it was finally settled on confidential terms.

How Easy Is a Tablet to Use?
(TechCrunch) Pretty damn easy, as it turns out. In a remarkable experiment, OLPC (One Laptop per Child) researchers in Ethiopia handed a Motorola Xoom tablet to each member of a group of illiterate village children aged four to eight. Click the link to learn the amazing results.
Reviewed by Alina Shrourou, B.Sc. (Editor) Nov 1 2018

AstraZeneca will present 20 abstracts, including a late-breaking oral presentation on the full results from the Phase III cardiovascular (CV) outcomes trial (CVOT) DECLARE (Dapagliflozin Effect on Cardiovascular Events)-TIMI 58, the broadest SGLT2 inhibitor CVOT conducted to date, as well as new research from the Company’s Cardiovascular, Renal & Metabolism (CVMD) therapy area at the American Heart Association (AHA) Scientific Sessions, November 10-12, 2018, in Chicago, Illinois, USA.

New evidence will build on broad clinical research from AstraZeneca that aims to help redefine the management of CVMD diseases and address the need for a more proactive and holistic approach to patient care. Presentations will include findings from some of the largest trials in broad patient populations with FARXIGA (dapagliflozin) in type 2 diabetes (T2D), BRILINTA (ticagrelor) in patients with a history of heart attack, and in hyperkalemia.

Danilo Verge, Vice President, Cardiovascular, Renal & Metabolism, Global Medical Affairs, said: “An estimated 20 million people each year die from cardiovascular, renal and metabolic diseases, yet shared risk factors are frequently not diagnosed or addressed holistically. Our data at AHA reflect an integrated approach to managing the needs of patients living with type 2 diabetes and risk of cardiovascular or renal disease, and those with a history of cardiovascular disease at acute and long-term risk of recurrence. We stand firmly behind our mission to provide new solutions earlier in disease management to these patients at risk for multiple complications.”

DECLARE-TIMI 58: a landmark CVOT evaluating CV risk in patients with T2D

Clinical trial results showing the safety and efficacy of FARXIGA vs. placebo on primary CV and secondary renal efficacy outcomes in adults with T2D who have multiple CV risk factors or established CV disease will be presented in a late-breaking oral presentation (Late Breaking Abstract #19485). DECLARE-TIMI 58 evaluated the CV outcomes of FARXIGA vs. placebo over a period of up to five years, across 33 countries and in more than 17,000 adults with T2D with multiple CV risk factors or established CV disease.

In September 2018, AstraZeneca announced that FARXIGA met its primary safety endpoint of non-inferiority for major adverse cardiovascular events (MACE) and achieved a statistically significant reduction in the composite endpoint of hospitalization for heart failure (hHF) or CV death, one of the two primary efficacy endpoints. Additionally, fewer MACE events were observed with FARXIGA for the other primary efficacy endpoint; however, this did not reach statistical significance. Clinical trial results presented at AHA Scientific Sessions 2018 will include additional details on the primary CV safety and efficacy, as well as secondary renal efficacy outcomes from DECLARE-TIMI 58. FARXIGA is not indicated to reduce the risk of CV events, hHF or renal outcomes.

Three new sub-analyses from the PEGASUS-TIMI 54 trial will also be presented. The trial compared BRILINTA (90mg or 60mg twice daily) plus aspirin vs. aspirin alone in 21,162 patients with prior (1 to 3 years) heart attack. The sub-analyses evaluate:

- Whether clinical characteristics predicting bleeding and ischemic risk identify subgroups of patients who may derive benefit from long-term treatment with BRILINTA, with a lower risk of major bleeding (Poster #Sa2100)
- The effects of long-term use of BRILINTA in patients who have had a heart attack and who did not receive a coronary stent vs. those who did receive a coronary stent placement (Oral Presentation #102)
- The use of high-sensitivity cardiac troponin to identify patients who are at a higher risk of major CV events (Oral Presentation #100)

Data will also be presented on potential risk factors for repeated or persistent hyperkalemia (Poster #SuMDP65).

Source: https://www.astrazeneca.com/media-centre/press-releases/2018/the-landmark-declare-timi-58-cardiovascular-outcomes-trial-of-farxiga-in-patients-with-type-2-diabetes-to-be-featured-at-aha-01112018.html
Reviewed by James Ives, M.Psych. (Editor) Jun 18 2019

The loss of complete segments of the esophagus often results from treatments for esophageal cancer or congenital abnormalities, and current methods to re-establish continuity are inadequate. Now, working with a rat model, researchers have developed a promising reconstruction method based on the use of 3D-printed esophageal grafts. Their work is published in Tissue Engineering, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers.

Eun-Jae Chung, MD, PhD, Seoul National University Hospital, Korea, Jung-Woog Shin, PhD, Inje University, Korea, and colleagues present their research in an article titled “Tissue-Engineered Esophagus via Bioreactor Cultivation for Circumferential Esophageal Reconstruction”. The authors created a two-layered tubular scaffold with an electrospun nanofiber inner layer and 3D-printed strands in the outer layer. After seeding human mesenchymal stem cells on the inner layer, constructs were cultured in a bioreactor, and a new surgical technique was used for implantation, including the placement of a thyroid gland flap over the scaffold. Efficacy was compared with omentum-cultured scaffolding technology, and successful implantation and esophageal reconstruction were achieved based on several metrics.

“Dr. Chung and colleagues from Korea present an exciting approach for esophageal repair using a combined 3D printing and bioreactor cultivation strategy. Critically, their work shows integration of the engineered esophageal tissue with host tissue, indicating a clinically viable strategy for circumferential esophageal reconstruction.” – John P. Fisher, PhD, Tissue Engineering Co-Editor-in-Chief, Fischell Family Distinguished Professor and Department Chair, and Director of the NIH Center for Engineering Complex Tissues at the University of Maryland

Source: Mary Ann Liebert, Inc.
Journal reference: Kim, W. et al. (2019) Tissue-Engineered Esophagus via Bioreactor Cultivation for Circumferential Esophageal Reconstruction. Tissue Engineering. doi.org/10.1089/ten.tea.2018.0277