
- CANA ENTOURAGE AWARDABLE in CDAO TRADEWINDS SOLUTION MARKETPLACE
CANA ENTOURAGE has been assessed “Awardable” for Department of War work in the Chief Digital and Artificial Intelligence Office’s (CDAO) Tradewinds Solutions Marketplace. This designation means our solution has met rigorous evaluation standards and is ready for rapid acquisition by DoW customers. The Tradewinds Solutions Marketplace is the premier offering of Tradewinds, the Department of War’s (DoW’s) suite of tools and services designed to accelerate the procurement and adoption of Artificial Intelligence (AI)/Machine Learning (ML), data, and analytics capabilities.

CANA’s solutions are designed to deliver analytics-driven decision advantage for federal and commercial customers. Specifically, we combine operations research and software engineering to turn complex mission problems into deployable tools, backed by specialized expertise across defense logistics, energy, and multi-domain operations.

CANA’s President & CEO, Rob Cranston, acknowledged this achievement and noted, “Being selected for the ADVANA Tradewinds Solutions Marketplace is a significant milestone for CANA. Tradewinds represents the Department of War’s commitment to accelerating the adoption of operational AI and advanced analytics at speed and scale. ENTOURAGE’s approval validates both the technical rigor of our cUAS solution and its relevance to contested mission environments. We look forward to supporting rapid acquisition pathways that deliver decision advantage to the warfighter.”

CANA’s video, CANA ENTOURAGE, accessible only to government customers on the Tradewinds Solutions Marketplace, demonstrates how the ENTOURAGE software tool generates and coordinates defensive barriers of many unmanned systems. This capability is vital for protecting high-value assets—such as warships, ground units, and critical infrastructure—against modern, multi-domain threats, including drone and missile attacks.
CANA was recognized among a competitive field of applicants to the Tradewinds Solutions Marketplace, whose solutions demonstrated innovation, scalability, and potential impact on DoW missions.

Government customers interested in viewing the video solution can create a Tradewinds Solutions Marketplace account at tradewindai.com. Once logged in, government customers can search for our video titled “CANA ENTOURAGE”. Through the Marketplace, customer organizations may communicate directly with CANA to get more information or request a demonstration. When ready to procure CANA ENTOURAGE, government customers may engage their local contracting support activity. If they do not have a local contracting activity, they can reach out to success@tradewindai.com to be connected to the Tradewinds Acquisition Support team.

About CANA: CANA is a Veteran Women-Owned Small Business (VWOSB) that empowers federal and commercial organizations to thrive in a global digital world through precise and adaptable technology solutions. We fuse our rigorous analytics, top-tier talent, and specialized expertise to design customer-centric, powerful solutions. We strive to create an environment that allows our Team and Clients more time to focus on the things that matter most. For more information on CANA ENTOURAGE, contact: Chris Cichy, Director of Innovation & Strategy, ccichy@canallc.com, (951) 225-2419.

About the Tradewinds Solutions Marketplace: The Tradewinds Solutions Marketplace is a digital repository of post-competition, readily awardable pitch videos that address the Department of War’s (DoW’s) most significant challenges in the Artificial Intelligence/Machine Learning (AI/ML), data, and analytics space. All awardable solutions have been assessed through complex scoring rubrics and competitive procedures and are available to Government customers with a Marketplace account. Government customers can create an account at www.tradewindai.com.
Tradewinds is housed in the DoW’s Chief Digital and Artificial Intelligence Office. For more information or media requests, contact: success@tradewindai.com
- CANA's Software Development Methodology
By Joe Moreno

The first time you do it, it’s a hack. The second time you do it, it’s a trick. The third time you do it, it’s a best practice.

CANA has developed a modern approach to software development called CANABAN that’s ideal for project-based, budget-driven organizations. CANABAN’s 11 steps meld the best of Agile/Scrum and Kanban with project management best practices. We’ve successfully applied our CANABAN process to both product development and custom software for commercial and government clients, enabling effective asynchronous collaboration across a fully remote workforce spanning from Maine to Hawaii.

Waterfall is the first formalized software development process, dating back to the 1950s. Waterfall is a linear process where work flows in fixed stages, typically Requirements → Design → Implementation → Testing → Deployment → Maintenance, before returning to the first step. This is similar to how large-scale physical systems were developed. Think: Mercury, Gemini, and Apollo space programs. When building physical systems, it was inherently a bad idea to change something during, say, the implementation stage that wasn’t incorporated into the requirements or design stage. Over time, however, it became clear that moving electrons, as is done in software development, is much more flexible and forgiving than moving atoms when creating interoperable systems. That led to the development of a range of Agile software development methodologies. While Waterfall is linear, Agile is iterative and adaptive, with frequent feedback, incremental delivery, and automated testing. Waterfall treats software development steps as discrete events, while Agile treats them as a continuous process.

The rise of the World Wide Web in the early 1990s created demand for many more software engineers, all searching for the best way to develop and deliver software. In 2001, the Agile Manifesto was published with a focus on four key values:
1. Individuals and interactions over processes and tools.
2. Working software over comprehensive documentation.
3. Customer collaboration over contract negotiation.
4. Responding to change over following a plan.

Not only was software development changing rapidly in the 1990s and 2000s, but so was software deployment, especially with the advent of cloud computing in 2006.

Today’s Two Leading Software Development Methodologies

The two leading Agile software development methodologies in use today are Scrum and Kanban. Scrum is known for its timeboxed iteration periods of software development called Sprints. Kanban is known for the exact opposite: continuous flow and incremental improvement without fixed iterations.

Scrum originated before cloud servers, when deploying software was a major undertaking handled by system administrators. Each software release had to be fully packaged and production-ready, much like shrink-wrapped software distributed on disks or CD/DVD-ROMs. Cloud computing made deploying software to servers much easier by enabling automation in a process now known as Development/Operations (DevOps). Developers could simply check their code into a software repository, which kicked off an automated build of the final product for testing and deployment. While Scrum was a fast and iterative process, DevOps removed the need to tie deployments to fixed dates and times. Instead, as soon as a new feature was completed, it could be deployed into production. With DevOps, developers and operations engineers (system administrators) could now work on the same team, or even be the same people. The old days of packaging software with a specific version number were gone when it came to deploying software on a server. In other words, Scrum works nicely for client software running on an individual’s computer (such as an updated operating system or web browser version), whereas software running on a server can be deployed multiple times in a day or week.
We are familiar with different versions of Windows or macOS running on our computers, such as Windows 11 24H2 or macOS Tahoe 26.2. But we don’t think of different versions of Facebook or X.com, since updates can be deployed at any time. For example, Facebook typically deploys new code with bug fixes and new features three times per business day (morning, afternoon, and evening).

CANABAN: The CANA Secret to Software Development

The software challenge faced at CANA is that we are a matrixed organization. Our team members work on multiple projects throughout the month, so they are matrixed into different teams while working remotely and as asynchronously as possible. This allows us to share resources across multiple projects by giving everyone at the company monthly project time allocations. One person might spend 80 hours on Project A and another 80 hours on Project B; meanwhile, another person might spend 40 hours working on four different projects throughout the month. So how do we ensure everyone is working as quickly as possible while keeping our developers, Project Managers (PMs), and Subject Matter Experts (SMEs) all in sync, on time, and within budget while maintaining expectations at every step? That’s the challenge CANA’s 11 steps of CANABAN solve. CANA’s CANABAN combines the best of Scrum with Kanban, using Jira as the PM tool.

1. Backlog: Anyone on a project can add issues to the backlog, such as new feature requests, bugs, tasks, user stories, etc.

2. SME Prep: “Definition of Done” – Subject Matter Experts/Project Leads, through discussions with software developers, add details and clarification to each Jira issue with the goal of minimizing the need for future guidance. User interface (UI) mockups are ideal tools for conveying a feature’s intended behavior and user experience (UX).

3.
Needs Estimate: Software developers, through discussions with the SMEs, estimate how long it will take for a developer to complete and component-test a Jira issue, along with creating end-to-end test scripts. Software developer estimates are exclusive of administrative overhead such as meetings, rework, SLAs, etc. Additional administrative overhead estimation is the responsibility of the Project Manager. Ideally, estimates should be generated by the developer who will be coding the task. Estimates older than one month should be revisited and updated if necessary.

4. Estimated: The software developer will transition a Jira issue to Estimated once they have estimated it.

5. Funded: The Project Lead evaluates which of the estimated tasks can be completed within the allocated budget of time, money, and resources available (e.g., quarterly). Funded tasks are prioritized by the Project Lead and assigned a Label tag for the month’s targeted completion.

6. To Do (with month’s due date): The Project Lead will transition tasks targeting the upcoming/current month from Funded to To Do in order to prioritize software developer efforts and update the Label with the targeted month. “If blocked, then get ahead”: If a software developer with remaining allocated hours has completed all planned Jira issues for the current month, or is blocked from transitioning any of their other active Jira issues, they should choose a Jira issue that’s most likely to target the following month and transition it to the To Do status. When doing this, they will update the month’s due date to the following month so PMs/PLs/SMEs recognize that this Jira issue was not required to be completed in the current month.

7. In Progress: Software developers will transition Jira issues they are actively working on from To Do to In Progress.

8.
Demo Ready: Software developers will transition a Jira issue from In Progress to Demo Ready when they are prepared to review it with the SME or Project Lead. Additionally, the software developers will write end-to-end test scripts using the Cypress framework. Over time, all of these test scripts are collected and automatically run when deploying into production, creating a growing suite of regression tests.

9. In Testing: The Tester, typically a Subject Matter Expert (SME) or Project Lead, will conduct a comprehensive assessment of a Jira issue in a testing environment to ensure it functions effectively and aligns with the overall needs, scope, and integrity of the product.

10. Done: The Tester will transition a Jira issue to Done once they have validated that it meets the definition of done. Trivial work (enhancements, adjustments, or changes) may be negotiated based on resource availability. Non-trivial work may require reprioritization of incomplete Jira issues or a new Jira issue added to the appropriate workflow step (i.e., Backlog, SME Prep, Needs Estimate, Estimated, or Funded).

11. Accepted: The Product Owner transitions the Jira issue from Done to Accepted once they certify that it meets the needs of the broader business or user requirements and is ready for release or deployment. This is the final step in the workflow, as the task has passed all stages, including both development and formal approval, ensuring alignment with business needs.

CANABAN is more than just a set of steps; it's CANA's proven secret to effectively managing a matrixed, remote workforce within budget. By combining the best of Scrum's structure and Kanban's continuous flow, CANABAN ensures that developers, Project Managers, and Subject Matter Experts stay in sync, delivering value efficiently and consistently from the backlog all the way to customer acceptance.
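Viewed abstractly, the 11 CANABAN steps form a simple state machine: each Jira issue moves through a fixed sequence of statuses, and each transition belongs to a particular role. The sketch below is a rough illustration of that idea, not CANA's actual tooling (Jira enforces workflow transitions natively); the role names attached to each transition are inferred from the step descriptions above.

```python
# Sketch of the CANABAN workflow as a state machine. Status names come from
# the article; the role permitted to perform each transition is inferred
# from the step descriptions (an assumption, not CANA's Jira config).

CANABAN_TRANSITIONS = {
    # (from_status, to_status): role allowed to make the move
    ("Backlog", "SME Prep"): "SME/Project Lead",
    ("SME Prep", "Needs Estimate"): "SME/Project Lead",
    ("Needs Estimate", "Estimated"): "Developer",
    ("Estimated", "Funded"): "Project Lead",
    ("Funded", "To Do"): "Project Lead",
    ("To Do", "In Progress"): "Developer",
    ("In Progress", "Demo Ready"): "Developer",
    ("Demo Ready", "In Testing"): "Tester",
    ("In Testing", "Done"): "Tester",
    ("Done", "Accepted"): "Product Owner",
}


class JiraIssue:
    def __init__(self, title):
        self.title = title
        self.status = "Backlog"  # every issue starts in the Backlog

    def transition(self, to_status, role):
        # Reject moves that skip steps or are made by the wrong role.
        allowed_role = CANABAN_TRANSITIONS.get((self.status, to_status))
        if allowed_role is None:
            raise ValueError(f"No transition {self.status} -> {to_status}")
        if role != allowed_role:
            raise PermissionError(f"{to_status} requires {allowed_role}, not {role}")
        self.status = to_status


issue = JiraIssue("Add CSV export")  # hypothetical example issue
issue.transition("SME Prep", "SME/Project Lead")
issue.transition("Needs Estimate", "SME/Project Lead")
issue.transition("Estimated", "Developer")
print(issue.status)  # Estimated
```

The point of the model is the same as the point of the process: an issue cannot reach In Progress without first being estimated and funded, so developer effort is always spent on work that fits the month's budget.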
CANABAN embodies the CANA culture by creating an environment that allows our Team and Clients more time to focus on the things that matter most.

About the Author: Joe Moreno is the Director of Software Development at CANA, LLC. He cut his software development teeth while working as a software engineer at Apple Inc. during the Steve Jobs era.
- Celebrating the Success of ETF 2025 and Our Community Leaders!
By Scott Cohick

The Emerging Techniques Forum (ETF) 2025—marking its 10th anniversary—was a tremendous success. Held 2–5 December 2025 and hosted by Systems Planning & Analysis (SPA), the event brought together 96 registrants for three days of vital discussions on analytic methods, AI, and the future of operations research in national security.

This year’s program reflected the rapidly shifting analytic landscape. Artificial intelligence dominated the agenda, with presenters underscoring the need for transparency, verification, and governance as AI becomes increasingly embedded in decision-support systems. Bayesian networks saw renewed interest for their interpretability compared to black-box models. Senior leaders emphasized accelerating the fielding of analytic tools, adopting more agile development approaches, and improving the integration of logistics and protraction-of-war considerations into campaign models. These conversations reinforced ETF’s role as a venue where emerging techniques directly connect to operational and strategic challenges.

ETF 2025’s success was made possible through the leadership and contributions of several CANA team members. Scott Cohick, Principal Operations Research Analyst, served as Conference Chair and provided steady guidance through major challenges—including a venue change and a government shutdown—ensuring the forum remained strong and impactful. Nick Ulmer, serving as MORS President, played a central role in championing the forum’s vision and supporting its execution at the enterprise level. Walt DeGrange, CANA’s Senior Director of Analytics, contributed as a presenter, bringing forward critical insights that enriched the technical dialogue. Together, their efforts exemplified CANA’s commitment to strengthening the operations research profession. A massive thank you is also due to all the Planning Committee volunteers who met tirelessly from March through November. Your dedication was the engine behind this year’s success.
Planning is already underway for an even better ETF in 2026. If you’re eager to help shape the future of operations research, please reach out to get involved.
- Beyond the Spreadsheet: Why a Clinical Lens is CANA’s Secret Weapon
When most people think of a Director of Business & People Operations, they imagine someone with a traditional MBA or a career spent strictly climbing the Human Resources ladder. They see spreadsheets, compliance checklists, and rigid corporate hierarchies. But at CANA LLC, we have always done things a little differently, and that includes the decision to create the Director of Business & People Operations position in 2023.

Late in 2021, I approached CANA leadership with this question: “Do you think CANA would have a place for me?” At that point, I was ready for a career change. As a Licensed Clinical Social Worker with over 25 years of experience in behavioral health, I wasn’t certain my question would have a positive follow-up. But over the course of the following year, I had regular meetings with CANA’s CFO and Business Operations Lead, and we developed a positive rapport and a position description. I was hired on January 1, 2023. Choosing a clinician to lead business and people strategy was a bold, intentional move. It was a signal that at CANA, "People Focused" isn't just a tagline on a website—it is the heartbeat of our operations.

The Strength of a Clinical Perspective

My background in behavioral health and private practice isn't just a separate chapter of my life; it is the foundation of how I lead. In clinical work, you learn to listen for what isn’t being said. You learn that accountability and empathy aren't opposites; they are partners. At CANA, we value vulnerability within our relationships. In a typical HR setting, vulnerability is often seen as a liability. Here, my clinical background allows me to treat it as a strength. By being open and transparent in our intentions, we create a space where team members feel safe to bring their whole selves to work. My clinical lens helps me facilitate those "human" moments that a traditionally trained HR professional might overlook in favor of a policy manual.
Redefining Operations Through Support and Transparency

Having a diverse supervisory background—from leading program expansions to owning my own therapy practice—has fundamentally changed how I approach the "Business" side of my title. At CANA, we support one another, and that support is built into our very structure.

Navigating the "Hard" Side of People Ops: In any business, there are difficult days: interpersonal conflicts, personal struggles that bleed into the workday, and the heavy reality of terminations or unpopular decisions. My clinical training allows me to navigate these moments with a de-escalation mindset and a focus on dignity. Instead of approaching a conflict as a "violation," I approach it as a clinician—looking for the root cause, practicing active listening, and ensuring that even in termination, a person’s humanity is preserved. Having the professional courage to facilitate hard conversations in a way that promotes healing rather than harm is a challenge, but not an impossible one.

Policy Development with Clarity: When looking at policy development and/or updates, I not only ensure Defense Contract Audit Agency (DCAA) and Federal Acquisition Regulation (FAR) compliance, I also work with my team to look for ways to make policies transparent and easy to navigate. We work as a team to build structure, ensuring our policies serve the people, not the other way around.

Onboarding and Mentorship: Through the recruitment and onboarding of new hires, as well as with the CANA Futures internship program, I use my experience in clinical supervision to do more than just "hire." We share knowledge and focus on self-assessment and growth. We want our new hires and interns to feel the CANA Culture from day one—a culture rooted in mutual trust.

Communication and Marketing: Overseeing our Digital Media and Graphics team is about more than brand awareness. It’s about transparency.
We share our wins, our challenges, and our stories because we believe in being open with each other and our clients.

Innovation Rooted in Wellness and Accountability

A clinician’s superpower is the ability to balance high-level strategy with individual well-being. This is why the CANA Wellness Initiative is an integral part of my work and the CANA culture. Weekly informative whole-health wellness posts, virtual activities to engage team members in a casual manner, and wellness challenges to spark friendly competition are all baked into our Wellness Initiative. We aren't just checking boxes; we are learning through growth and prioritizing work-life balance in a high-stakes virtual environment. Truth be told, we are able to have fun together as a result!

One of the CANA Values is to be accountable to ourselves and to each other, which naturally fosters a trusting relationship between colleagues. In therapy, accountability is only effective when an investment is made in the therapeutic process and trust is established. When a person knows their best interest is at heart when they are being challenged, that person is more likely to accept accountability for their decisions and actions. The same is true of a working relationship, where trust paves the way for accountability. Accountability allows for vulnerability. Vulnerability opens the door for growth. Everyone wins.

Final Thoughts

At the end of the day, my role is to bridge the gap between internal business functions and people strategy. My clinical lens allows me to see the "why" behind the "what." It’s about more than just financial stability and monthly allocations—though those are vital for our profitability; it’s about building a sustainable, human-centric organization. CANA took a chance on a Social Worker to lead their People Operations, and in doing so, they proved that when you put people first, business excellence follows naturally.
I’m proud to show that a therapist’s heart and a director’s mind are the perfect combination for a team that values connection as much as they value results.
- Speed to Capability: Closing the Energy Gap with the Testing and Evaluation Unit
By Adam Evans

2025 saw a significant shift in military energy strategy that we expect to continue for the next several years. The Department of War (DoW) began a deliberate shift away from broad decarbonization goals toward an emphasis on operational energy (OE) resilience and tactical autonomy, as evidenced by its renewed interest in nuclear microreactors. This move portends a 2026 that prioritizes energy dominance and security over environmental mandates. Simultaneously, DoW leadership initiated a radical overhaul of its acquisition policy through the memorandum “Transforming the Warfighting Acquisition System.” The reform policy emphasizes “speed to capability delivery” and more effective use of commercial solutions. Among the procedural changes aimed at improving contracting mechanisms is a directed preference for Other Transaction Authority (OTA) vehicles for prototype and follow-on production items; notably, OTAs don’t incur traditional FAR restrictions. This reform promises to maximize contracting flexibility while significantly reducing the burden of current FAR (and DFARS) regulatory complexity.

In concert with these policy changes, CANA members examined ways to bring advanced energy technology into operators' hands faster, with greater confidence in its effectiveness, expected performance, and compatibility. The research culminated in a draft policy paper calling for the establishment of Testing and Evaluation Energy Units (TEEUs). These units would ideally work directly with the services, DoW organizations like DIU or USD(A&S), and commercial partners to select, test, and collect data on advanced energy technologies. These units would focus on components, systems, or software that support tactical microgrids in an operational environment and help establish and validate protocols defined in the recently approved Tactical Microgrid Standard (TMS), MIL-STD-3071.
The logic behind this push is to shorten the time it takes to put the best energy technology the commercial sector has to offer into operators' hands by giving them agency in the vetting process and relying on them to provide advocacy leading up to acquisition. Additionally, this call for TEEUs addresses a critical need in the DoW's energy technology research efforts: the lack of electrical power and energy data to inform such research. As we move into 2026, advances in energy technology and the DoW’s evolving acquisition policies can work together to shorten the time to deliver game-changing energy technology. These energy technologies should prioritize improving mission effectiveness, not mere energy efficiency. Energy acquisition and its underlying research will need to characterize the speed and ease with which such technology facilitates tactical autonomy in an increasingly austere and dispersed operational environment. The best way to do this, we believe, is through the advocacy of those who will rely on it when it matters.
- My Internship Journey: Lessons in Code, Collaboration, & Confidence
By Gavin Rosander, CANA Futures Program Intern

During my internship at CANA, I’ve had the opportunity to dive deep into a fast-moving, established project, evolving from core tasks to delivering system-wide impact. This journey has been defined by feature development, system validation, and the cultivation of foundational code discipline.

Technical Contributions & High-Impact Projects

My work focused on enhancing the intelligence and usability of a Supply Chain Assistant, a multi-agent system designed for complex data synthesis.

1. Dynamic Risk Report Tool
I built the initial tool to generate risk reports based on current context and database-stored graphs. To ensure the model never lacked necessary information, I expanded this tool with:
Intelligent Retrieval: I added a web scraper to gather supplementary information if the model required more context.
Client Delivery: The final output is a complete, dynamic report served directly to the user for immediate download.

2. Enhancing Application Quality & UX
To improve the experience for complex threads, I developed an intelligent chat history summarization feature that reduces clutter. Additionally, I am currently building a full authentication pipeline, which includes:
Login & Registration: Developing the system for new application access.
Security & Access Levels: Implementing role-based security to ensure appropriate access for different user types.
Cosmos Integration: Integrating the system directly with Cosmos DB.

Validating the Multi-Agent System

A significant portion of my internship was dedicated to system validation and performance metrics. I assisted in writing performance metrics to test the reliability of agent tool calling and argument passing. One of my key results was achieving 100% accuracy in SPARQL-to-text fact-checking. I wrote and executed tests to measure how accurately the system translates technical SPARQL query results into natural language summaries.
Growth Through Technical Discipline

My first professional experience with large-scale testing involved writing comprehensive unit tests for all MCP tools. This process instilled a vital lesson: clean, modular code is the foundation for testability. I learned that if a function is too confusing to test, it needs to be broken down—a realization that has made me a more disciplined coder.

The Power of the Environment & Mentorship

Beyond the code, the environment at CANA played a crucial role in my professional development. A cornerstone of this experience was "Trusting the Process" through dedicated mentorship. I found myself in a unique and lucky position to have a mentor I trusted from day one. This relationship drastically lowered the barrier to asking "stupid questions," which was critical for accelerating my learning and making me feel comfortable contributing to the team. Having a guide who encouraged self-guided strategies for system integration added immense value to my journey.

Strategy for Success: The "Sandbox"

Jumping into a large, established, and "moving" project was initially daunting. To build confidence, I developed a "sandbox" strategy, creating a local, simplified version of the core components. This hands-on approach allowed me to:
Safely test ideas before making changes to the main project.
Rapidly understand the core architecture.
Develop a personalized method for learning that was essential for my style.

This combination of mentorship, technical discipline, and strategic learning allowed me to contribute meaningfully to the team while accelerating my personal growth.
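The testability lesson described above ("if a function is too confusing to test, it needs to be broken down") can be shown with a small, entirely hypothetical sketch. None of this is code from the Supply Chain Assistant project; the function and field names are invented for illustration. A report builder that fetches, filters, and formats in one blob is hard to unit-test, while splitting out the pure pieces makes each one trivially testable without mocks:

```python
# Hypothetical illustration of "break it down for testability".
# All names (filter_high_risk, build_risk_report, etc.) are invented.

def filter_high_risk(records, threshold=0.7):
    """Pure function: easy to test with in-memory data."""
    return [r for r in records if r["risk_score"] >= threshold]

def format_summary(records):
    """Pure function: no I/O, so its unit test needs no mocks."""
    return "\n".join(f"{r['supplier']}: {r['risk_score']:.2f}" for r in records)

def build_risk_report(fetch_records, threshold=0.7):
    """Thin orchestrator: the only piece that touches I/O, injected as a
    callable so a test can pass a stub instead of a real database."""
    return format_summary(filter_high_risk(fetch_records(), threshold))

# Usage with a stubbed data source standing in for the database:
sample = [
    {"supplier": "Acme", "risk_score": 0.91},
    {"supplier": "Globex", "risk_score": 0.35},
]
report = build_risk_report(lambda: sample)
print(report)  # Acme: 0.91
```

The design choice is the lesson itself: once the pure logic is separated from the I/O, each function is small enough that a confusing test is a signal the function, not the test, needs work.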
- Megan's Corner: A Little HR Cheer To Close Out The Year
As the year winds down and calendars fill with holiday plans, I wanted to take a moment to share a little HR perspective — not about policies or payroll deadlines this time — but about people. The end of the year is often a mix of excitement, exhaustion, reflection, and gratitude. It’s the season where we pause just long enough to realize how much we’ve all handled, built, and learned together over the past twelve months. Whether this year brought growth, change, challenge, or all of the above — you showed up. And that matters.

From an HR lens, one of the most powerful things we see during the holidays is how intentional people become. More check-ins. More thank-yous. More flexibility. More grace. And honestly? That’s something worth carrying into the new year, not just December.

So as you wrap up projects, take PTO, travel, rest, or simply unplug for a bit — my hope is that you take time to reset and reflect on what you need heading into the next chapter. Burnout isn’t a badge of honor. Rest is productive. And balance isn’t a luxury — it’s necessary.

On behalf of HR, thank you for another year of trust, communication, and teamwork. We’re grateful for each of you, and we’re excited for what’s ahead in 2026. Wishing you a peaceful holiday season and a strong, healthy start to the new year.

— Megan

#workplaceculture #WFH #CANA #HR #HRAnalyst #humanresources #HRTips #workculture

Megan Saylor is our Human Resource Analyst here at CANA. If you would like to get in touch with Megan, you can do so at msaylor@canallc.com or on LinkedIn.
- Beyond the Algorithm: How Teaching Analytics Cultivates Indispensable Human Skills
By Walt DeGrange & Nick Ulmer

In the dynamic world of analytics, technical prowess is undeniably essential, giving us the tools to transform data into insights. However, what truly distinguishes exceptional analysts—the ones who successfully drive change—is their ability to distill complex ideas into clear, actionable insights. This crucial skill, often honed in the classroom, reveals the profound connection between teaching and practicing analytics. Many recognize that teaching strengthens familiarity with analytical methods. But the deeper advantage is that teaching compels you to break down intricate mathematical concepts and present them in a way that resonates with someone unfamiliar with them. This honed explanation skill is crucial because the successful application of analytics in the real world is fundamentally a human-to-human endeavor. At CANA, we understand this relationship intimately. Many of CANA’s analytical professionals have had the opportunity to teach at the Naval Postgraduate School. Nick and I continue to teach both professional society courses and graduate-level courses. This blend of instruction and real-world practice ensures that our insights are grounded in firsthand experience.

The Human Component: Analytics' Biggest Hurdle

Successfully implementing a new analytic model or tool is often challenged more by the human component than by technical issues. Resistance to technological change is frequently linked to social, behavioral, and cultural issues, such as people’s resistance to change, fear of knowing the truth, or reluctance to share data or information. These are issues related to organizational and individual concerns rather than technical matters. Analysts, acting as change agents, must affect outcomes and ensure implementation success. Without strong human-to-human communication, even optimal analytic models can be met with resistance, forcing analysts to ask, “Why won’t they use our model?”
Effective human communication addresses these organizational and individual concerns, which often outweigh purely technical matters.

Teaching as a Lab for Empathy and Rapport

The classroom or training session serves as an invaluable testing ground, where instructors constantly refine their explanations through interaction with students possessing diverse backgrounds and varying levels of understanding. This constant refinement strengthens the ability to communicate insights effectively and cultivates the essential soft skills necessary to work with non-technical stakeholders in the professional world.

This experience is key to building rapport, which is communication rooted in trust. To achieve rapport, you must be able to see the situation from the other person's point of view. Teaching requires this constant orientation around the learner, forcing the instructor to anticipate needs and adapt the message dynamically. This empathy allows the analyst to make the audience care about the message by connecting the findings back to the business question, reducing the impact of potential math anxiety.

Mastering the Art of Explanation

Teaching refines the analyst's ability to translate complex, multi-faceted mathematical models into terms that non-technical audiences can grasp and trust. The goal is to articulate complex ideas clearly and concisely and to remove comprehension barriers by explaining potentially unfamiliar topics and terms upfront. Several specific techniques honed in the classroom translate directly to the client engagement space:

- Using Analogies: Instructors routinely use analogies to clarify new concepts by setting them within the framework of something familiar, such as comparing hypothesis testing to a criminal trial. These analogies link complex mathematical concepts to similar ideas in other domains, making them powerful tools for ensuring comprehension.
- Atomizing Knowledge: Analysts gain proficiency in atomizing knowledge, which means breaking down complex concepts into small, memorable, and teachable pieces. By exposing stakeholders to these "bite-sized chunks" over time—for instance, defining parts of a methodology over several briefings—analysts can incrementally increase understanding and build trust in the technique.

- Explaining Cause and Effect: The classroom forces clarity not just on the "how" of the analytical solution, but also on the crucial "why"—the significance of the analytical solution to the real-world problem. Teaching reinforces that providing insights requires framing information in an engaging way that addresses the recipient’s goals.

The Value of Two-Way Communication

Effective communication is a two-way street. The classroom teaches the importance of listening and acknowledging, an essential component of professional interaction. In a balanced environment, students feel comfortable enough to ask questions and seek clarification. In client settings, this translates to improved performance in gathering requirements and managing stakeholders. Listening mindfully and asking targeted questions helps the analyst uncover essential, sometimes inexplicit features of the problem, allowing them to frame an ambiguous business problem as a clear, measurable analytics problem.

Conclusion: The Indispensable Human Professional

The ability to seamlessly convey the significance of analytical findings to clients, colleagues, and leadership—fostering deeper understanding and crucial buy-in—is the true, practical benefit of teaching analytics. By developing these skills in an educational environment, analysts cultivate a more human, connected, and mature professional presence. Teaching what you practice creates a well-rounded and impactful professional.
It ensures that while automation may provide the “how” behind analytics, the analyst remains the indispensable human element bridging the gap by providing the “why” and connecting the analysis to real-world context.
- Predicting the Unpredictable: The Challenges of Analytics in the NFL
The 2025 NFL season marks 106 years of thrilling gridiron action. With 32 teams playing a grueling 17-game schedule, the league generates a staggering amount of data. Yet, despite this abundance, the NFL presents a unique challenge for data analysts: predicting the unpredictable.

Football, unlike many other sports, is inherently unpredictable. Every play is a new opportunity for momentum to shift, and the outcome can be drastically influenced by a single errant pass or a lucky bounce. The game's unique dynamics, coupled with the limited number of possessions per game, make it challenging to extract meaningful patterns from the data.

Compared to other major sports, the NFL plays a relatively small number of games, both in the regular season and the playoffs. This limited dataset poses significant challenges for analysts seeking to develop accurate predictive models. The difficulty is compounded by the constant evolution of strategies and rule changes, which can render previously effective models and metrics obsolete.

To address the data limitations, analysts can turn to simulation techniques. By running thousands of simulated games, it's possible to generate synthetic data that supplements the real-world data and improves the accuracy of predictive models. Simulation helps create a larger dataset, allowing analysts to explore more complex relationships between variables and develop more robust models.

When working with limited data, there's always the risk of overfitting models to noise or assigning undue importance to metrics that have little impact on the outcome of the game. This is particularly true in football, where the unpredictability of the game can make it difficult to distinguish between meaningful patterns and random fluctuations.
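As a rough illustration of the simulation idea above, the sketch below treats each game as a series of drives that score with some probability, then runs many simulated games to build a synthetic set of outcomes. The per-drive scoring probabilities, drive counts, and point values here are made-up parameters for illustration, not estimates fitted to real NFL data.

```python
import random

def simulate_game(rng, p_a, p_b, drives=11, points_per_score=7):
    # One simulated game: each team gets a fixed number of drives,
    # and each drive scores with that team's per-drive probability.
    score_a = sum(points_per_score for _ in range(drives) if rng.random() < p_a)
    score_b = sum(points_per_score for _ in range(drives) if rng.random() < p_b)
    return score_a, score_b

def estimate_win_prob(p_a, p_b, n_games=10_000, seed=42):
    # Run many simulated games and count Team A's wins; the resulting
    # synthetic outcomes can supplement a small real-world dataset.
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_games):
        a, b = simulate_game(rng, p_a, p_b)
        if a > b:
            wins += 1
    return wins / n_games
```

With a large number of simulated games, a team given a higher per-drive scoring probability wins well over half the time, while evenly matched teams land near a coin flip (ties aside); the same machinery lets an analyst probe how sensitive outcomes are to each assumed parameter.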
It's crucial to approach data analysis with a critical eye and to continually validate the effectiveness of key predictors. The challenges faced in NFL analytics, such as limited data and the unpredictable nature of the game, are not unique to this sport. Many other analytical problems, particularly in fields like finance, healthcare, and social sciences, also involve dealing with small datasets, complex systems, and the need to predict future events based on limited information. For instance, financial analysts may struggle to predict stock market trends due to the volatile nature of the market and the limited historical data available. Similarly, healthcare researchers may face challenges in developing accurate disease prediction models due to the complexity of human biology and the limited availability of patient data. By understanding the challenges and techniques used in NFL analytics, researchers in other fields can gain valuable insights and apply similar approaches to their own problems. Key Takeaways for Applying Analytics to Football Embrace the challenge of limited data: The NFL's unique characteristics present a formidable obstacle for data analysts. Utilize simulations to augment data: Synthetic data generated through simulations can help overcome the limitations of real-world data. Continuously validate key metrics: As the game evolves, it's crucial to regularly assess the effectiveness of predictive metrics. By understanding these challenges and leveraging innovative approaches, data analysts can continue to contribute to the fascinating world of NFL analytics and help teams make informed decisions in the face of uncertainty and, when applied more generally, other challenging business decisions. #NFL #NFLanalytics #Sportsanalytics #footballseason #NFL2025 #DataAnalytics #PredictTheGame Walt DeGrange Walt DeGrange is the Senior Director of Analytics here at CANA. You can contact Walt via email at wdegrange@canallc.com or on Linkedin .
- An introduction to knowledge graphs
In today’s world, data is everywhere—and it’s growing at an unprecedented pace. But having data isn’t enough. The real challenge lies in making sense of it all. How do we move beyond simply collecting isolated facts to uncovering the meaningful connections between them? Enter knowledge graphs - the key to translating raw data into actionable insights!

A knowledge graph is a sophisticated, interconnected system for organizing information. Rather than storing isolated facts, it maps how each key piece of information is associated with others. For example, a knowledge graph might show how a person or a product (each called a “node”) is connected to a location (another node) by a relationship (called an “edge”), such as where a product is available. Knowledge graphs can also be enhanced with metadata if needed, e.g., to track the provenance (origin) of the data, or the relative strength or certainty of an asserted relationship. This web of relationships allows for a more contextual understanding of data.

Knowledge graphs are increasingly vital tools, and at CANA, we apply them across a range of projects - such as mapping global supply chains. These maps allow us to explore intricate details, from the sourcing of individual parts and the components within them, to the locations of manufacturers, the political and legal environments in which they operate, and even corporate leadership structures. By connecting all this information, we’re able to analyze the broader context and extract meaningful, actionable insights. We use knowledge graphs to understand not just isolated data points, but the relationships between them - because, as we all know, businesses do not operate within vacuums. What happens around them is just as critical as what happens within.

So what are the big benefits of knowledge graphs?

- They Provide Context: A knowledge graph doesn't just tell you "Widgets R Us is a company." It can also tell you "Widgets R Us is a company that makes widgets," and "Widgets are a type of antenna," and "antennas are transducers."
This helps us understand the bigger picture.

- They Enable Smarter Search and Discovery: Users can ask more complex, conceptual questions. For example, CANA could easily find "all the suppliers of Meals-Ready-To-Eat" or "the number of American companies that build commercial drones." The graph understands the underlying meaning of your query.

- They Power AI and Machine Learning: The organized, clearly defined structure of knowledge graphs is a boon for AI and its ability to learn and reason from information. This makes AI applications more accurate, insightful, and capable of performing more complex tasks.

- They Facilitate Data Integration: Knowledge graphs can eliminate the siloing of information. They act as unifiers, connecting disparate data sources and creating an enterprise-wide view of information.

At its core, a knowledge graph often uses a structure called a "triple" to represent information. A triple consists of a Subject (an entity, e.g., "Brand X Trucking"), a Predicate or Relationship (how the subject relates to the object, e.g., "is a partner of"), and an Object (the entity the subject is related to, e.g., "X Mining Corporation"). A triple can represent an internal property of a node (e.g., Brand X Trucking was founded in Y year) or a connection between nodes. So, a simple piece of information like "Brand X Trucking is a partner of X Mining Corporation" becomes a clear, interconnected part of the graph, ready to be linked with other facts. In this example, the "triple" structure is considered an atomic unit - the smallest meaningful piece of information the knowledge graph can use.

Building a Knowledge Graph

So you're ready to build a knowledge graph - how do you actually do it? It's a fascinating blend of data engineering and cutting-edge artificial intelligence (AI). If you're curious to see one in action without all the heavy lifting, you can leverage a generative AI tool.
Simply ask it to generate the code for a knowledge graph on a topic of your choice. For instance, prompting an AI tool with "write the code for a knowledge graph that looks at drones in the United States since 2000" will produce an HTML application. This application, often powered by D3.js, visually represents a force-directed graph, showcasing the intricate relationships between different entities like drones, manufacturers, and regulations. You can even enhance the detail by requesting association scores, and the tool will incorporate weighted connections to show the strength of these relationships. The real beauty here is the ease with which anyone can copy, paste, and adapt this generated code to suit their specific needs.

This provides an informative, albeit basic, starting point. For more robust applications, significant data validation and additional data would be necessary. However, these skills aren't just for serious data analysis; they can be applied to other engaging and entertaining purposes as well. Imagine, for example, requesting a graph that visualizes the number of books about zombies by the author's country of origin. The result, much like the previous example, would be generated almost instantaneously!

If your interest is piqued, what are some things to consider in building a graph on your own?

- Ontology Design: First, define the graph's purpose, and then its "rules of the road". An ontology lays out all the types of entities in your graph - like "Person" or "Company" - and defines the properties these entities can have. It also specifies how relationships between entities are defined, such as "works for" or "manufactures". The ontology can be used for data validation, too, which is helpful for AI agents.
For example, if an AI agent generated a knowledge graph from a news article, and some of its triples didn't abide by the ontology's vocabulary, you could pass the erroneous triples back to the agent, asking it to correct them before proceeding. Think of the ontology as a comprehensive rulebook, roster, and referee rolled into one.

- Data Collection & Preprocessing: Data for your knowledge graph can come from everywhere – databases, files, and especially unstructured text from sources like webpages and documents. This could be anything from a few documents to enormous sets of what is aptly called big data. Just like spring cleaning, once you pull everything out of the metaphorical drawers and closets, it all needs to be sorted, cleaned, and organized. Advanced AI techniques like Natural Language Processing (NLP) and Large Language Models (LLMs) are crucial here. They can identify entities (like people, places, and things) within text and determine the relationships between them. In turn, duplicate items are removed, conflicts are resolved, and everything is standardized. There's no hoarding allowed!

- Data Ingestion & Storage: Knowledge graphs live in special graph databases (like Neo4j or Amazon Neptune). These offer storage and the ability to quickly navigate the intricate web of nodes and edges.

- Refinement & Maintenance: A knowledge graph is a tool that constantly evolves! It can deduce new facts that weren't explicitly added, like the transitive property we learned in high school. For example, if it knows "A is a type of B" and "B is a type of C," it can deduce "A is a type of C." What's more, when new information comes in, the graph is continuously updated, ensuring it remains relevant and accurate.

This overview barely scratches the surface of the power and complexity of knowledge graphs. While numerous tools are available to get you started, be prepared for challenges; even the most well-designed graphs require ongoing refinement.
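To make the triple structure and the transitive-inference idea concrete, here is a minimal sketch in plain Python, reusing the illustrative entities from the examples above. This is a toy stand-in for a graph database: a real system would store the triples in something like Neo4j or Amazon Neptune and query them with a graph query language.

```python
# A tiny knowledge graph as a set of (subject, predicate, object) triples.
# Entity names echo the illustrative examples in the text.
triples = {
    ("Brand X Trucking", "is a partner of", "X Mining Corporation"),
    ("Widgets R Us", "makes", "widget"),
    ("widget", "is a type of", "antenna"),
    ("antenna", "is a type of", "transducer"),
}

def infer_transitive(facts, predicate="is a type of"):
    """Repeatedly add (a, p, c) whenever (a, p, b) and (b, p, c) are known."""
    facts = set(facts)
    while True:
        new = {(a, predicate, c)
               for (a, p1, b) in facts if p1 == predicate
               for (b2, p2, c) in facts if p2 == predicate and b2 == b}
        if new <= facts:          # nothing left to deduce
            return facts
        facts |= new

enriched = infer_transitive(triples)
# The enriched graph now also contains ("widget", "is a type of", "transducer"),
# a fact that was never stated explicitly.
```

The same loop is a miniature version of the "deduce new facts" refinement step described above: the stated A-to-B and B-to-C edges yield the A-to-C edge automatically.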
Stay tuned for Part II of An Introduction to Knowledge Graphs where we’ll delve into some of the specific challenges the CANA team has encountered over time and the solutions we’ve developed.
- CANA, CRATE, and the chatbot
CANA has always encouraged its team members to think outside the box and test new ideas. A recent CANA CRATE project perfectly exemplifies this spirit. CRATE stands for the CANA Research and Technology Enterprise, and it's our way to foster CANA innovation and encourage technical exploration. Our intended result is the creation of new knowledge, theories, models, or datasets that can be used by others for analysis or further research. Not only that, we often find ways to do things better and improve the process by which we turn ideas into reality.

Structurally, CANA's CRATE projects are either externally or internally focused. An external example might be exploring energy-efficient technologies for military installations and equipment, while a recent CRATE - the BizOps Chatbot - represents an internal project. No matter what, these are good repetitions - the more we do it, the better at it we become.

CANA Data Science Analyst Kiko Whiteley recently presented "BizOps," a custom-built chatbot designed to streamline workflows and answer frequently asked CANA Business Operations questions. This isn't another off-the-shelf solution – it's CANA ingenuity in leveraging existing tools to create cost-effective and highly tailored AI solutions. The project goals were to address frequently asked questions and free up the team from repetitive inquiries; offer a centralized, editable knowledge base; provide accessibility; and explore in-house AI development by gaining insights into the process of building custom AI tools.

A Sneak Peek: BizOps in Action

At one of our regular CANA Analytics Roundtables, Kiko demonstrated the chatbot's capabilities. When asked, "When does my Paid-Time-Off accrue?", the chatbot not only provided a summary answer, but also cited relevant company documents as its source. It also effortlessly answered more general CANA-specific questions, demonstrating its ability to pull information from arbitrary text within its knowledge base.
A key design element is how the chatbot presents its answers: a concise summary followed by clear assertions or claims directly backed up by evidence.

The Secret Sauce: Clever Engineering & Cost-Efficiency

Kiko highlighted the strategic approach taken in building BizOps. While managed AI services exist, they often come with a significant cost and less flexibility. CANA's solution combines several Google Cloud functionalities with open-source software to achieve a powerful yet economical outcome. The architecture leverages:

- Google's Gemini for natural language understanding and response generation.
- The Google Chat API, which enables seamless integration and accessibility across various devices.
- Open-source software, which forms the backbone of the chatbot's logic and knowledge retrieval.
- Containerization, which allows for efficient and low-cost cloud deployment, minimizing continuous running costs.

Decoding the Dialogue: The Power of Prompt Engineering

Kiko offered a carefully crafted prompt structure that guides the chatbot's responses. The most effective approach involved clear instructions that guide the AI on how to interpret and answer; contextual information to provide relevant snippets from the knowledge base; the specific query from the user/team member; and final reminders to reinforce desired output formats and behaviors.

The process of fetching the most relevant information is crucial. The BizOps chatbot employs:

- Document Chunking: Automatically splitting documents into manageable paragraph-sized segments.
- Vector Store: Transforming these text chunks into numerical representations.
- Semantic Search: Using machine learning to identify the text chunks most semantically similar to the user's question.

Interestingly, the Gemini model's generous context window allowed for the entire knowledge base to be included in the prompt as a safety net.
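The chunk-embed-search loop described above can be sketched in a few lines of Python. This is a simplified stand-in, not CANA's implementation: it uses a bag-of-words count vector with cosine similarity in place of the learned embedding model a production vector store would call.

```python
import math
from collections import Counter

def chunk_document(text, max_words=60):
    # Document chunking: split into roughly paragraph-sized word windows.
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def embed(text):
    # Stand-in "embedding": a bag-of-words count vector. A real vector
    # store would call a learned embedding model instead.
    return Counter(text.lower().split())

def cosine(u, v):
    # Cosine similarity between two sparse count vectors.
    dot = sum(u[w] * v[w] for w in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def top_chunks(question, chunks, k=2):
    # Semantic search: rank chunks by similarity to the question.
    q = embed(question)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]
```

The top-ranked chunks would then be pasted into the prompt as the "contextual information" section ahead of the user's question, with the full knowledge base available as the safety net described above.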
The prompts were also specifically designed to ensure responses included direct quotes and clickable links to the source material, even linking to other Google Chats for announcements.

Key Takeaways & Future Potential

The team had some powerful takeaways:

- Combining Existing Tools: There is power in integrating readily available technologies for cost-effective solutions compared to expensive managed services.
- Strategic Pre-processing: It is important to handle as much processing as possible outside the Large Language Model (LLM), with clear instructions and examples for higher-quality responses.
- Navigating Platform Nuances: The project enabled CANA to document learnings specific to Google Cloud for future projects.
- Continuous Evolution: There is value in staying aware of the constant stream of new tools and features within Google Cloud.

Looking Ahead

Excitingly, the Business Operations project has laid the groundwork for creating other internal chatbots within CANA. Furthermore, the team has identified potential pathways to leverage this knowledge and the developed tools for external, enterprise-level applications. CANA continues to encourage team members to propose their own innovative CRATE projects, emphasizing that even seemingly small experiments can yield valuable insights and advancements.

This project showcases CANA's commitment to innovation and its ability to develop practical, cost-effective AI solutions by creatively combining existing technologies. The future of internal communication and efficiency at CANA looks brighter – and potentially more conversational!
- It's ALL ABOUT ENERGY
Energy resilience and independence are not just desirable – they are mission-critical. Consider the impact of Hurricane Florence on Marine Corps Base Camp Lejeune in 2018. An 11-day power outage at Camp Johnson, a key training area, resulted in major operational disruptions, training and deployment delays, and an estimated $1M loss daily during the power interruption. The recovery for the base lasted nearly a year.

This reality underscores installation vulnerability, a key concern at the TEVCON event CANA attended this spring in San Diego, California. Key federal and commercial industry stakeholders were on hand to address the major challenges and identify opportunities in advancing energy resilience. So what are some ways the military is taking action?

Natural disasters aren't once-in-a-lifetime events and are, in fact, occurring with greater frequency as the world grapples with climate change. Concerns like aging infrastructure, rising energy demands, supply chain disruptions, and cyber and physical threats, among others, increase the urgency of developing a wide range of energy solutions.

- Microgrids: Localized, independent energy networks with distributed resources like solar, wind, battery storage, and gas generators are being implemented to ensure on-site power generation and operational continuity.
- Renewable Energy Investments: Increasing on-site clean energy generation to reduce reliance on the grid and lower environmental impact.
- Energy Efficiency Measures: Reducing overall energy consumption to minimize demand.
- Exploring Advanced Technologies: The future includes microreactors. These small, modular nuclear reactors offer the potential for secure, reliable, and independent power generation, even in remote locations like those likely in dispersed operations. The DoD's ANPI program has a goal of two operational microreactors at military bases by 2030. The intent is for these reactors to provide 100 percent of all critical loads.
Solutions will be found all along the spectrum, reinforcing the need not to rely on a single source for energy needs. CANA's work includes research and testing on the utilization of 5G as a source of wireless power transmission, offering a potential capability that is widely available, plentiful, and discreet. Investing in energy resilience isn't just about preventing disruptions; it's about enhancing military readiness, reducing operational costs, and ensuring our forces can effectively respond to any challenge.










