English Language Arts: Combining Sources (TEKS.ELA.9-12.12.F)

Questions 1–8
1

Source 1: Using a multi-model ensemble of downscaled climate projections, hydrodynamic simulations, and machine-learning emulators, researchers estimate compound flooding along the Houston-Galveston coast under mid-century sea-level-rise scenarios. The study integrates surge, pluvial runoff, and tidal dynamics to produce street-level probabilistic inundation maps. Results indicate that events historically considered "100-year" floods approach decadal frequency when mean sea level rises 0.5 meters, with the strongest amplification in low-lying industrial corridors. Exposure analysis links census microdata and facility registries to inundation depths, projecting substantial growth in both household and critical-infrastructure risk absent adaptive measures. Uncertainty is quantified through Bayesian model averaging, which narrows but does not eliminate variance across storm tracks. The authors emphasize that engineered defenses targeted solely at channel conveyance yield diminishing returns when surge coincides with intense rainfall, arguing for flexible, layered strategies that include restoration of coastal wetlands, updated building elevations, and regionally coordinated retreat incentives in the most repeatedly affected zones.

Source 2: Drawing on multi-year ethnography in Gulf Coast neighborhoods adjacent to refineries and ship channels, a social scientist analyzes how residents interpret, communicate, and act on hurricane risk. Participant observation during two storm seasons reveals that evacuation decisions hinge less on official surge heights than on workplace obligations, kinship care chains, and trust in informal messengers. Many refinery contractors cannot leave until supervisors authorize shutdowns; others weigh the hazards of sheltering in traffic against uncertain flood timing. Households rely on text threads and faith networks that often contradict municipal alerts, producing what the author terms "shadow mobilities"—staggered departures shaped by shift schedules and caregiving. Map-based tools rarely influence choices because they fail to represent vertical living arrangements, vehicle access, or language diversity. The study recommends co-produced contingency planning led by community intermediaries, paid leave guarantees during declared emergencies, and drills that align industrial shutdown protocols with neighborhood transportation capacity and routes.

Which statement best synthesizes the two sources to create a new, accurate understanding?

A. Because compound-flood models show 100-year floods will be annual events, the region should prioritize bigger channels and pumps; community outreach can follow once infrastructure is built.

B. Evacuation behavior is largely irrational and unpredictable, so hazard maps add little value; the only viable fix is paying everyone to leave early.

C. A durable strategy pairs the models' street-level mapping of compound flooding with the ethnography's insight into labor, trust, and mobility constraints: align industrial shutdowns and paid leave with phased, co-produced evacuation timelines, target green-gray defenses where exposure clusters, and design warnings through community networks so physical and social systems reduce risk together.

D. By integrating residents' text threads into machine learning, officials can precisely predict who will evacuate and when, making traditional infrastructure investments unnecessary.

Explanation

C accurately combines hazard modeling (compound flooding, exposure clusters) with ethnographic constraints (work obligations, trust networks) to propose an integrated solution neither source alone provides. A overrelies on engineering, misstates frequency, and sidelines social factors. B misrepresents residents and dismisses modeling. D overclaims predictive precision and dismisses needed infrastructure.

2

Source 1: An engineering assessment of the February 2021 Texas blackout models generator performance using unit-level outage data, gas pipeline pressure records, and temperature-dependent derating curves. The analysis finds that simultaneous failures of gas supply and thermal plants, rather than wind variability, drove most unserved energy. Freeze-related outages at gas-fired units, compounded by wellhead and compressor freeze-offs, produced cascading capacity deficits, while load forecasts underestimated electrified heating demand under subfreezing durations uncommon in design standards. Probabilistic resource-adequacy simulations show that modest winterization of fuel supply and plant auxiliaries—heat tracing, enclosure upgrades, and insulation—significantly reduces loss-of-load risk even without large new capacity additions. The study also evaluates ERCOT scarcity pricing during the event, concluding that prolonged administrative price caps neither induced new supply nor efficiently rationed demand once physical constraints bound. It recommends mandatory cold-weather standards harmonized across electricity and gas networks, with verification protocols, penalties for non-compliance, and public transparency measures.

Source 2: A political economy study situates the 2021 crisis within ERCOT's long experiment with energy-only markets and institutional fragmentation between electricity regulators and gas oversight. Archival rulemaking records and interviews with market participants show that firms rationally underinvested in winter resilience because the expected value of extreme-cold revenues was discounted by regulatory uncertainty and the absence of enforceable performance obligations. The grid's limited interties and jurisdictional insulation reduced opportunities for mutual aid, while competitive neutrality norms discouraged coordination on fuel security that could advantage particular suppliers. The authors argue that scarcity pricing cannot substitute for governance capable of aligning private incentives with system reliability when failures are correlated across sectors. They propose cross-sector reliability authorities with data-sharing mandates, performance-based regulation that rewards verifiable readiness, and a resilience cost-recovery mechanism that socializes targeted investments while preserving competition. Democratic accountability, they conclude, requires clearer assignment of duties during emergencies and ex post review.

Which conclusion best synthesizes the sources to advance a fuller understanding of the blackout and reforms?

A. Because engineering evidence shows that gas supply and thermal freeze-offs, not wind, drove outages and that modest winterization sharply cuts loss-of-load risk, and political-economy analysis shows that firms lacked incentives under fragmented governance, the most reliable path combines mandatory cross-sector cold-weather standards, verified and funded through performance-based regulation and cost recovery, with limited interties as backup—not reliance on scarcity pricing alone.

B. Since the blackout stemmed from plant equipment freezing, Texas should simply insulate generators and keep price caps high; broader market reforms are unnecessary and would undermine competition.

C. The crisis primarily reflected wind's intermittency; ERCOT should restrict renewables until gas output stabilizes and winter demand falls.

D. Full interconnection with other regions would have prevented all outages, so weatherization and governance reform are secondary concerns.

Explanation

A integrates technical findings (freeze-offs, value of winterization) with incentive and governance insights (need for cross-sector authority and cost-recovery), producing a solution that neither source alone fully articulates. B privileges engineering fixes while ignoring incentive misalignment. C misrepresents the engineering evidence. D overstates what interties could guarantee and ignores documented causes.

3

Source 1: In multi-site field trials across semi-arid test stations, agronomists evaluated a CRISPR-edited sorghum line with altered stomatal density and root-to-shoot signaling. Over three seasons, the edited variety outperformed elite checks under moderate water stress by increasing water-use efficiency and maintaining grain fill during anthesis, as measured by canopy temperature depression and carbon isotope discrimination. Yield advantages disappeared under well-watered conditions and narrowed under severe drought, indicating an adaptive window rather than universal superiority. The study paired phenotyping with farmer-managed plots, finding that performance was robust to heterogeneous fertilizer practices but sensitive to planting date. Seed purity and trait stability were confirmed via off-target screening across successive generations. The authors caution that benefits depend on complementary management—timely sowing, residue retention, and local extension—arguing that gene editing should be viewed as one tool within climate adaptation portfolios, not a substitute for improved agronomy, soil moisture conservation, or investment in small-scale irrigation and storage.

Source 2: An ethics and science-technology studies analysis examines gene-edited crops through the lens of seed sovereignty among smallholders in the Sahel. Drawing on interviews with farmer cooperatives, local breeders, and public seed banks, the study finds enthusiasm for drought-resilient traits tempered by concern over intellectual property, input lock-in, and loss of community control over selection and exchange. Farmers value traits that match staggered sowing calendars, intercropping, and storage limitations more than maximum yield in trial conditions. The authors document how informal seed networks buffer climate variability by distributing risk across landraces; centralized licensing can weaken those buffers if replacement seed is costly or restricted. They propose governance arrangements that include open licensing for public-good traits, transparent trait disclosure on packets, and participatory varietal selection to align lab priorities with situated agronomy. Equity-focused benefit-sharing and liability rules for inadvertent mixing are framed as prerequisites for legitimate adoption at scale by rural smallholders.

Which synthesis statement best integrates both sources into a more insightful conclusion?

A. Gene editing demonstrably boosts yields, so deploying edited sorghum will solve drought impacts regardless of local practices; farmers can adapt later if needed.

B. Because seed sovereignty is threatened, gene-edited crops should be rejected until all intellectual property issues disappear.

C. Both sources discuss drought resilience in agriculture, so the main lesson is to continue research while informing farmers about options.

D. Taken together, the trials show a real but context-bound physiological gain that depends on management and timing, while the ethics study specifies the governance needed for durable uptake: open or fair licensing, participatory selection that matches local calendars, clear trait disclosure, and benefit-sharing. Synthesis suggests edited sorghum can aid adaptation when embedded in farmer-led systems that protect control and support agronomy.

Explanation

D fairly represents the agronomic evidence (conditional performance, need for complementary practices) and the governance/ethics findings (licensing, participation, benefit-sharing), creating a pathway neither source alone provides. A overgeneralizes the trials and ignores governance. B misrepresents the ethics source as absolutist. C repeats both topics without generating new understanding.

4

Source 1: A longitudinal quasi-experimental study of dual-language immersion in the Rio Grande Valley tracks cohorts from grade one through high school using matched comparison groups and difference-in-differences estimation. Students assigned by lottery to two-way programs showed slower early English reading growth but surpassed peers by middle school in reading and math, with effects persisting to graduation: higher completion rates, more advanced coursework, and higher rates of biliteracy certification. The analysts control for baseline proficiency, socioeconomic status, and campus effects, and probe mechanisms through course-taking patterns, finding that sustained exposure to academic Spanish correlates with later gains in disciplinary vocabulary and metalinguistic awareness. Program quality matters: schools with stable bilingual staffing and articulated K-12 curricula produced the strongest outcomes. The authors caution that benefits diminish when pullout remediation replaces content instruction, concluding that additive bilingual models can close achievement gaps without sacrificing English attainment when implemented with coherent, long-horizon design and adequate resources.

Source 2: An ethnographic study of classrooms and community spaces in the Rio Grande Valley conceptualizes bilingualism as a translanguaging repertoire tied to cross-border identities and labor histories. Through video analysis and oral histories, the researcher documents students fluidly mobilizing Spanish, English, and regional varieties to negotiate meaning, authority, and care—from helping relatives complete forms to interpreting workplace safety briefings. Rather than switches between discrete codes, these practices are collaborative sense-making that links school literacy with family obligations and media from both sides of the river. Rigid standardization often silences these resources, erasing regional Spanish and stigmatizing English influenced by migration. Programs that validate community literacies—inviting elders as co-teachers, designing projects around border civics, and using bilingual texts that reflect local speech—appear to deepen participation and identity affirmation. The study argues that sustained partnerships between schools and community institutions are essential for curriculum relevance and for expanding students' agency as multilingual citizens and advocates.

Which conclusion best synthesizes the quantitative and ethnographic findings into a stronger understanding of effective bilingual education?

A. Because dual-language immersion raises test scores by high school, schools should focus on accelerating English and standardizing Spanish to avoid community language variation that slows growth.

B. The quantitative study's long-run gains are most likely when programs treat bilingualism as a community resource: designing K-12 curricula and staffing to sustain additive models while partnering with families to legitimize the translanguaging practices documented ethnographically. Slower early English growth functions as an investment in biliteracy that later expands disciplinary learning and identity.

C. Ethnography shows regional Spanish can be stigmatizing, so dual-language programs risk harming identity and should be paused until better materials exist.

D. Because students already use border literacies outside school, formal instruction is less necessary; schools can rely on community learning to deliver bilingual outcomes.

Explanation

B accurately integrates long-term academic effects with ethnographic evidence about translanguaging and community partnerships, yielding a practical synthesis about design conditions for success. A misapplies standardization and dismisses community practices. C misrepresents the ethnographic argument. D overreaches by downplaying the role of structured instruction documented in the quantitative study.

5

Source A (engineering reliability analysis): A peer‑reviewed study in the Journal of Energy Systems analyzes the February 2021 Texas blackouts using probabilistic risk modeling and component‑level failure data. The authors attribute cascading outages primarily to correlated cold‑weather failures in gas supply, plant instrumentation, and wind turbine gearboxes, amplified by limited reserve margins. Simulations indicate that targeted weatherization of gas processing, heat tracing for sensors, and de‑icing retrofits could reduce loss of load by more than half under similar meteorological stress. The model also shows that modest DC interties to neighboring interconnections would meaningfully lower tail risk without eliminating Texas's operational autonomy. The authors caution that capacity expansion alone cannot address common‑mode failures.

Source B (regulatory and market design history): A policy monograph from a nonpartisan energy economics institute traces how Texas's energy‑only market, scarcity pricing, and fragmented gas regulation shaped investment incentives. Archival rulemaking records show persistent underinvestment in winterization after mild winters and volatile gas prices. The report argues that reliability contracts and harmonized gas‑power oversight are prerequisites for sustained resilience.

Which synthesis best integrates the engineering and policy perspectives to create a more comprehensive conclusion about improving reliability of the Texas grid?

A. Reliability will primarily improve if generators invest in cold‑weather hardware upgrades; market debates are secondary because technology can overcome price signals.

B. Since both sources highlight winter conditions, Texas should federalize its grid to eliminate local market distortions and fully rely on neighboring states for reserves.

C. Reducing blackout risk requires pairing targeted weatherization across fuel supply and plant instrumentation with market and regulatory reforms that reward that resilience—such as reliability contracts and aligned gas‑power oversight—supplemented by limited interties to share extreme‑event reserves.

D. The main takeaway is that adding more capacity will end scarcity pricing episodes; incentives will naturally correct once more plants are built, making interties unnecessary.

Explanation

Option C synthesizes Source A's evidence for specific technical fixes and risk reduction from limited interties with Source B's case that incentives and cross‑sector oversight must change to sustain such investments. A overweights engineering while sidelining incentives. B misrepresents the sources by leaping to federalization and full reliance on neighbors. D contradicts Source A's warning about common‑mode failures and ignores Source B's incentive problem.

6

Source A (remote sensing and geomorphology): A Gulf Coast ecology study uses satellite interferometry and lidar to map vertical accretion and landward marsh migration along Texas bays. It finds that many reaches lack sufficient sediment to keep pace with relative sea‑level rise and subsidence, creating bottlenecks where upland development blocks migration corridors. Modeling suggests that strategic sediment augmentation and removal of hard barriers can preserve wetland function, but benefits depend on local wave climate and tidal prism.

Source B (community‑based management): A coastal governance case series profiles Texas communities piloting living shorelines and oyster‑reef restoration. Researchers document how locally led projects improved shoreline stability and fisheries while building stewardship. Yet success hinged on co‑management agreements, flexible permitting, and labor from civic groups. Projects that ignored community priorities or maintenance capacity underperformed, even with strong biophysical designs.

Which synthesis most accurately combines the biophysical and governance findings to propose a higher‑level strategy for resilient Texas coastal restoration?

A. Effective restoration in Texas will couple sediment‑informed designs—such as targeted augmentation and migration corridor protection—with co‑managed living shorelines and oyster projects that match local stewardship capacity and permitting flexibility.

B. Because satellite models already identify where wetlands will persist, community involvement mainly serves outreach; state agencies should prioritize sediment placement according to model rankings.

C. Community‑led living shorelines are sufficient for resilience because local knowledge adapts to any sediment or sea‑level constraints, making geomorphic modeling unnecessary.

D. Texas should standardize one shoreline template statewide to streamline permits; variation in wave climate and community capacity only slows deployment.

Explanation

A integrates Source A's sediment thresholds and corridor needs with Source B's evidence that co‑management and capacity determine durable outcomes, yielding a new, actionable strategy. B reduces communities to outreach and overrelies on modeling. C overstates local knowledge by dismissing documented geomorphic limits. D ignores both the physical heterogeneity and social capacity differences highlighted by the sources.

7

Source A (computer science audit): A peer‑reviewed computer science article evaluates hiring algorithms on a benchmark dataset and shows that common bias mitigations (reweighting, adversarial debiasing) can reduce disparities on specific fairness metrics but often create trade‑offs among calibration, selection rates, and error profiles across groups. The authors emphasize the need for context‑specific, multi‑metric audits and documentation of training data provenance.

Source B (legal and ethical analysis): A law review piece argues that employers using automated hiring tools face potential disparate‑impact liability absent demonstrable job‑relatedness and necessity. It recommends procedural safeguards: impact assessments, documentation of feature validity, meaningful explanation to applicants, and accessible appeal mechanisms. The article warns that metric parity alone may not satisfy legal standards if the process remains opaque or lacks applicant recourse.

Which conclusion best synthesizes the technical and legal perspectives to advance guidance for responsible algorithmic hiring?

A. As long as an algorithm equalizes selection rates across groups, it will meet legal standards, so companies should focus exclusively on that metric.

B. Legal exposure stems mostly from public relations failures; transparent marketing combined with any bias mitigation is sufficient to avoid liability.

C. Because fairness metrics conflict, employers should abandon automation and return to human screening to eliminate disparate impact concerns.

D. Responsible deployment requires multi‑metric, data‑provenance audits and documented feature validity, paired with procedural safeguards—impact assessments, applicant explanations, and appeals—to align technical mitigation with disparate‑impact standards.

Explanation

D fuses Source A's call for multi‑metric, provenance‑aware audits with Source B's requirements for job‑relatedness, transparency, and applicant recourse, producing guidance that neither source provides alone. A overrelies on one metric and misreads the legal piece. B reduces legal risk to PR. C ignores both sources' more nuanced, practicable approaches.

8

Source A (psycholinguistic longitudinal study): A multi‑year study follows students in dual‑language programs and finds that additive bilingualism correlates with higher metalinguistic awareness and transfer of academic vocabulary across content areas. Effects are strongest where sustained exposure exists in both languages and where assessment practices value conceptual understanding rather than surface accuracy.

Source B (classroom ethnography): An ethnographic analysis of secondary classrooms observes that instructional routines and identity‑affirming discourse shape participation. Teachers who legitimize translanguaging and connect texts to students' cultural histories foster deeper argumentation and peer feedback, while rigid language policing suppresses risk‑taking, even in dual‑language settings. The study cautions against attributing outcomes to program labels without examining interactional norms.

Which synthesis statement most effectively integrates these findings to offer a higher‑order conclusion about bilingual education outcomes?

A. Dual‑language labels guarantee cognitive advantages; variation in classroom talk is a minor implementation detail.

B. Bilingual benefits emerge most reliably when program structures provide balanced exposure and assessments target concepts, and when classroom discourse legitimizes translanguaging and identity, aligning interactional norms with those structural supports.

C. Because classroom discourse explains most variance, structural features like assessment design and language exposure are largely irrelevant.

D. To avoid confusion, classrooms should enforce strict single‑language use during academic tasks; cognitive gains come from minimizing language mixing.

Explanation

B synthesizes Source A's structural conditions for additive bilingual gains with Source B's evidence that discourse norms and identity work enable those gains to materialize, creating a more complete account than either alone. A and D misrepresent the ethnography and conflict with the translanguaging findings. C dismisses the longitudinal study's structural effects.