Tuesday, June 12, 2012

Corporations Must Manage Taxes More Intelligently

If you want to read more about my views, please check out a longer, more comprehensive research note on this topic at our Web site. ###

For the past couple of years, I’ve been asserting that most larger companies (those with 1,000 or more employees) need to adopt a new approach to using software to handle their taxes comprehensively, both the direct sort (income taxes) and the indirect variety (sales and use as well as value-added/goods and services taxes). It’s an emerging enterprise challenge driven by more competent and determined tax enforcement by governments worldwide. It will require corporations to change how they employ software to manage their taxes, structure their tax-related data, and manage their tax processes. Increasingly, corporations will need tighter control over tax data management, tax calculation, and the associated tax processes so they can optimize their tax liabilities while minimizing their tax risk exposure.

There are a couple of important game changers at work that fundamentally alter the way larger companies need to manage their taxes. One is a more effective use of technology by governments to collect taxes; the other is increased cooperation between taxing authorities to share information. In the United States, the Internal Revenue Service (IRS) has long shared its tax return data with individual states, and now the number of international bilateral information-sharing agreements is growing, which will have a profound impact on how companies manage transfer pricing. If you don’t think this is a seismic shift, think again. A generation ago, Swiss bank secrecy was inviolate. Today, tax authorities in the United States, United Kingdom, and (soon) Germany will be getting reports from Swiss banks about their respective citizens’ accounts.


Today, few companies are prepared to deal with a more challenging tax enforcement environment. Unless they deal with it strategically, they are likely to pay more taxes and incur greater fines than necessary. Corporations must step back and rethink how they manage taxes. They must address their information, technology, and process shortcomings to achieve the lowest possible tax expense and manage their tax-related risks more effectively.



Related:

Failure to Innovate Can Be a Fatal Risk

The focus of many risk management programs today is to avoid risk or, at the very least, to minimize risk to its lowest level. While that may seem like a rational approach given the economic crisis we have just experienced, it is not necessarily a wise one. One of the greatest risks to any company is its failure to continually innovate. Examples abound of companies that did not continue to question how to create new products or deliver new services to meet the fickle demand of consumers. Once wildly successful companies like Blockbuster Video or America Online have seen their fortunes turn very quickly as other companies have invested in new delivery channels.

Another company that did not continue to push the bounds of innovation is the social network provider MySpace. As recently as 2006, MySpace was the most popular social networking website in the United States. At the height of its popularity, MySpace was acquired by media giant News Corp. According to a report this week in the Los Angeles Times, that acquisition severely limited MySpace’s ability to innovate. Here is how the Times contrasted MySpace with the leading social network provider, Facebook:


“There’s no short explanation for Myspace’s stunning fall, but people with knowledge of the situation say the social network struggled to innovate once it had been absorbed by the old-media giant. Also, Myspace’s managers came under pressure to wring profits from the site, while Facebook’s private investors were willing to absorb losses to invest in the future. Facebook engineers were ordered to make the site more engaging for users while more independent software developers were attracted to make popular applications for the site.”

Now, there are reports from various sources that News Corp. will either spin off MySpace as a separate entity or shut down the operation altogether. Whatever path it chooses, it is clear in this case that failure to innovate can be a fatal risk. ###


Related:

Successful Price Optimization Has Multiple Dimensions



Price and revenue optimization (PRO) is a business discipline used to effect demand-based pricing; it applies market segmentation techniques to achieve strategic objectives such as increased profitability or higher market share. PRO first came into wide use in the airline and hospitality industries in the 1980s as a way of maximizing returns from less flexible travelers (such as people on business trips) while minimizing the unsold inventory by selling incremental seats on flights or hotel room nights at discounted prices to more discretionary buyers (typically vacationers). Today, it is a well-developed part of any business strategy in the travel industry and increasingly used in others.
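To make the segmentation mechanics concrete, here is a minimal sketch in Python. The demand curves, capacity, and price grid are invented for illustration; they are not drawn from any vendor’s PRO system or from our research data.

```python
# Toy demand-based pricing for two segments (illustrative assumptions only).

def expected_demand(price, base_demand, sensitivity):
    """Simple linear demand model: demand falls as price rises."""
    return max(0.0, base_demand - sensitivity * price)

def revenue(business_price, leisure_price, capacity=180):
    """Revenue from two segments sharing a fixed pool of seats."""
    business = expected_demand(business_price, base_demand=80, sensitivity=0.10)
    leisure = expected_demand(leisure_price, base_demand=220, sensitivity=0.90)
    # Protect seats for the less price-sensitive segment first, then fill
    # what remains with discounted leisure demand.
    business_sold = min(business, capacity)
    leisure_sold = min(leisure, capacity - business_sold)
    return business_price * business_sold + leisure_price * leisure_sold

# Coarse grid search for the price pair that maximizes revenue.
best = max(
    ((b, l, revenue(b, l)) for b in range(200, 801, 25) for l in range(50, 401, 25)),
    key=lambda item: item[2],
)
print(f"Business fare ${best[0]}, leisure fare ${best[1]}, revenue ${best[2]:,.0f}")
```

Real PRO systems replace the toy demand curves with statistically estimated ones and re-optimize continuously, but the underlying trade-off is the one described above: protect capacity for less flexible buyers while discounting to fill what would otherwise go unsold.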

People and process meet in the ongoing evaluation of price-setting practices by a cross-functional team that incorporates all stakeholders. Initially these people will meet frequently (at least once a month), but it may only require a quarterly review as PRO matures. There also must be a well-defined price analytics review process to ensure the methodologies the company is using are sound.

Pricing strategy and execution must take into account external factors. In particular, different cultures and businesses often have their own attitudes toward fixed and negotiated pricing. In some cases, especially in consumer markets where fixed prices have been the norm, people may consider price optimization “unfair.” Companies that try to implement a PRO strategy must realize that they may encounter resistance and be careful in how their marketing and communications position their approach to pricing. That noted, despite some annoyance, people have grown accustomed to highly variable airline and hotel pricing. Also, there may be legal and regulatory issues that impinge on a company’s pricing flexibility.


Lastly, the company must acquire the right software, implement it properly and tailor it to its needs; it also should be easy to deploy and maintain. When it comes to pricing, there can be subtle differences in the needs of particular types of business; prospective buyers should focus on vendors that have strong references in their specific industry.

Above all, companies must have a realistic pricing strategy that is closely aligned with their capabilities, product strategy and competitive position. In a scale-driven business, for instance, it probably doesn’t make sense for a small player to try to be the low-cost provider. Instead, pricing software enables these companies to find ways to maximize pricing in a price-conscious market by designing offerings with valued features and services that add to their margin.


Easy, rapid access to the data needed to support the use of pricing algorithms is a prerequisite for successful implementation of a pricing strategy. Such data feeds the analytics and facilitates rapid pricing-decision cycles. Our research consistently shows that access to the appropriate data is an issue for a majority of companies and that this issue grows in proportion to the company’s size.

I’ve identified six components that corporations must consider and manage well to be successful in using PRO: strategy, external factors, people, process, information and technology (software). Here are some thoughts on each of them.

As its name suggests, demand-based pricing is a method that uses the buyer’s demand, based on an estimate of a good’s or service’s perceived value to the buyer, as the central element in setting price. Pricing strategies are most important because they can have a disproportionate impact (positive and negative) on a company’s bottom line. Managing prices has always been an activity of keen interest, but it has become even more so over the past decade as a result of the constrained pricing environment.



As to the people dimension, management needs to ensure that the groups involved are behind the effort. It’s extremely important that incentives (especially sales compensation) are properly aligned with the price optimization objectives that I recently covered. In many cases, ongoing training will be necessary to continually refine techniques and deal with issues that arise. For some organizations, a “center of pricing excellence” may be a useful way to build on their experience and entrench a culture of price optimization. Exactly how this is handled depends on whether the company has a centralized or decentralized structure to manage pricing.



Related:

IFRS Pros and Cons


More insights will come as more U.S.-based companies move ahead with their conversion thinking and efforts.

Fortunately, a new, more constructive mantra has appeared: “IFRS case studies are coming, IFRS case studies are coming!” This case study details United Technologies’ approach to, and insights on, the conversion.

Matthew Birney is a manager in the manufacturing conglomerate’s financial reporting department responsible for International Financial Reporting Standards. He says that there are pluses (access to a wider talent pool) and minuses (IFRS is more open to interpretation than GAAP) to the pending move.


The large accounting firms (those with the most SEC registrants) will be a good source for IFRS information; not only do they possess the expertise, but they also have a potentially huge financial stake in the conversion, as IFRS compliance will likely prove complex, time-consuming, and profitable (or costly, depending on where you sit). ###




I’m familiar with the interpretation challenges (and believe that industry standards will emerge fairly quickly to ensure that investors can make apples-to-apples comparisons); the point on talent benefits is new and interesting.

The “IFRS is coming, IFRS is coming!” chorus has quieted a bit amidst all of the uncertainty surrounding the economic crisis, the new SEC leadership, and the future of the U.S. regulatory system.

Related:

Risk Management: What Needs Fixing, What Doesn’t

Hirth, whom I will interview about the study in an upcoming post, also says that the survey respondents’ focus on risk appetite and strategic risk (which together relate to the risks companies choose to take on as the result of specific assumptions and certain organizational biases) reflects what he and his team recently have witnessed in the field.


Here are the top five corporate risk management areas in need of improvement, according to more than 600 internal auditors:

Respondents also identify compliance risk for financial reporting, public company reporting of risk, risk avoidance and the evaluation of risk reporting (both at the operating unit level and at the senior management level) as areas of relatively high competency.

1. Emerging risks;

2. Evaluating and changing risk appetite levels;

3. Setting risk appetite;

4. Defining risk appetite; and

5. Strategic risk.

This information ranks among the most compelling findings of Protiviti’s 2011 Internal Audit Capabilities and Needs Survey, a 40-plus-page report released last week.

This “Needs to Improve” analysis represents a new category within a report that has appeared each of the past five years.

The fact that financial reporting risks are regarded as relatively well managed suggests that internal auditors are impressed by the long hours their finance and accounting colleagues have logged in managing internal controls in accordance with Sarbanes-Oxley. This finding also suggests that GRC currently represents more of an efficiency effort than an effectiveness effort as it relates to financial reporting risk.




Emerging risks are just that: risks that have yet to fully materialize. Robert B. Hirth Jr., Protiviti’s executive vice president and head of global internal audit, gives five examples: new industry rules; new business regulations (think Dodd-Frank); the impact of new technology (think smart phones); geopolitical upheaval (Libya’s effect on oil prices); and natural events (the impact of the Japanese crisis on high-tech supply chains).

The survey findings indicate that the following risk-management areas are performed with relatively high levels of competency (and therefore, are least likely to need improvement):

1. Process-level risk;

2. Functional-level risk;

3. Transaction-level risk;

4. Location-level risk; and

5. Operational risk.




Related:

Avoid Software Maintenance Fees




Marc Benioff, CEO of Salesforce.com, is one of the few true iconoclasts in the IT industry. So, when he called for the end of software maintenance fees a few months back, it got noticed — cheered by many, reviled by others.

Benioff’s latest anti-maintenance fee rant apparently was triggered by an Oracle Siebel customer who told him that his company paid $15 million for its Siebel maintenance. Salesforce.com, a software-as-a-service (SaaS) vendor, directly competes with Siebel, so it’s not surprising that Benioff would want to use that juicy tidbit against a competitor.

This is different from technical support, for which you pay extra depending on the level of service you want. Technical support buys you problem resolution. If you have a technical problem and want a one-hour response directly from a support engineer, you will pay more than if you can accept a 24-hour email response.

Software maintenance isn’t about this kind of technical support. It’s about getting routine updates and patches, something many feel they should get automatically with the license. In fact, organizations often get the updates and never install them. For the maintenance fee, you only get the update; you still have to install it, which may entail considerable work.


With SaaS, the maintenance fee is inherent. Whenever the product is updated, enhanced, fixed, patched, whatever, you get it automatically the next time you log on. Often you don’t even realize the software has been changed. It is part of what you buy when you pay your subscription fee. You might pay a SaaS vendor extra for additional professional services, but routine software maintenance is part of the deal.


Software maintenance fees, paid annually, amount to a percentage of the price of the license, typically 15 percent. If the cost of a license is $100,000, then the annual maintenance fee is $15,000. Each licensed software product you use probably comes with a maintenance fee. You don’t need many products before it adds up to real money.
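To see how quickly that adds up across a portfolio, here is a quick back-of-the-envelope sketch in Python. The product names and license prices are made-up examples; the 15 percent rate is the typical figure cited above, not any particular vendor’s terms.

```python
# Annual maintenance cost across a hypothetical license portfolio.
# License prices are invented; 15 percent is the typical rate cited above.
MAINTENANCE_RATE = 0.15

licenses = {
    "ERP suite": 500_000,
    "CRM": 100_000,
    "BI/reporting": 75_000,
}

for product, license_cost in licenses.items():
    print(f"{product}: ${license_cost * MAINTENANCE_RATE:,.0f} per year")

total = sum(cost * MAINTENANCE_RATE for cost in licenses.values())
print(f"Total annual maintenance: ${total:,.0f}")
```

On this made-up portfolio, the fees come to roughly $101,000 a year, every year, before a single new feature is deployed.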


Maintenance fees pose an interesting problem. Supposedly they are optional, but you usually have to go to great lengths and withstand intense pressure to get them stripped out of the contract. If you don’t pay, you won’t get software upgrades, patches, or fixes. And since software always has problems, vendors sow a lot of fear, uncertainty, and doubt (FUD) to drive maintenance fees.

Here are three things you can do to avoid software maintenance fees:

1. Negotiate — maintenance is negotiable. If the vendor won’t negotiate maintenance, look for another product.

2. Do without maintenance — based on your cost-risk analysis.

3. Opt for SaaS — with a SaaS product, maintenance is included.


SaaS usually results in lower software costs because SaaS vendors have certain cost advantages, such as not having to support all the different platforms customers run. They only need to support browser access. By adding SaaS products where appropriate, you can reduce overall software costs. ###

Benioff’s comments have been reproduced all over the Internet, here and here for example. Sure they’re self-serving, but that doesn’t mean he isn’t onto something.


Related:

Why Your Board Wants Compliance Stories

In a recent post I pointed to the power of narrative as a risk. In a recent exchange, Bart Schwartz, chairman of Guidepost Solutions (as well as a corporate monitor, which represents another trend – or at least a job title – of interest for 2012), shared some ideas related to how boards might strengthen their compliance contributions.

Do CFOs, risk officers, internal audit chiefs and other GRC executives need to present facts and figures to keep boards informed of risk? Certainly. But perhaps the impact of these numbers will sink in deeper if they are accompanied by anecdotes, illustrations and other elements that connect with our human need for engaging stories.


● Boards should increase their scrutiny of major risks that have not blossomed – “not because the risk is any less,” Schwartz explains, “but because management may have become too accustomed to the risk and too blasé about managing it.”



I’ve been talking to risk management, compliance and internal auditing experts this month to get a feel for how they expect their realms to evolve during the next 12 to 18 months. I’ve heard some interesting ideas. I’ve also heard the same interesting idea repeated more than once; and, as the saying goes, “here’s how journalists (or bloggers) count to three: one, two, trend.”

● Boards should set up a compliance and risk committee focused on the interplay between an effective enterprise risk management program and a compliance program that helps mitigate those risks to the company’s strategy, reputation, financials and operations.


Board members typically receive 200 to 300 pages of information, much of it risk-focused, to review each month, according to a Protiviti article. “Despite this abundance of data, quality analysis to steer recipients to salient points is often missing.” This shortcoming prodded the risk consulting firm to develop a customizable risk index that tells a crucial story through the elegant simplicity of a single number. The index does so by addressing two crucial questions boards need answered:



1. Is our organization riskier today than it was yesterday?

2. Is our organization likely to become riskier tomorrow than it is today?
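Protiviti has not published the mechanics behind its index, so the following is a purely hypothetical Python sketch of the general idea: roll weighted risk-factor scores into one composite number, then compare that number across periods to answer both questions. The factors, weights, and scores are invented for illustration.

```python
# Hypothetical composite risk index: not Protiviti's methodology, just an
# illustration of how one number can answer both board questions.
weights = {"credit": 0.30, "operational": 0.25, "compliance": 0.20, "strategic": 0.25}

# Risk-factor scores on a 1-10 scale for three views (invented data).
yesterday = {"credit": 4, "operational": 5, "compliance": 3, "strategic": 6}
today     = {"credit": 5, "operational": 5, "compliance": 4, "strategic": 7}
outlook   = {"credit": 6, "operational": 5, "compliance": 4, "strategic": 7}

def index(scores):
    """Weighted average of factor scores: higher means riskier."""
    return sum(weights[factor] * scores[factor] for factor in weights)

print(f"Index yesterday: {index(yesterday):.2f}")
print(f"Index today:     {index(today):.2f}  riskier than yesterday? {index(today) > index(yesterday)}")
print(f"Index outlook:   {index(outlook):.2f}  likely riskier tomorrow? {index(outlook) > index(today)}")
```

The elegance is in the comparison, not the arithmetic: a board that sees the same composite number every meeting can tell at a glance whether the trend is up or down.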

In addition to the suggestion that boards should request more anecdotes related to risk issues from their executive partners, Bart Schwartz, chairman of Guidepost Solutions, and Ken Handal, president of GRC at Guidepost Solutions, shared several other recommendations, including the following:


● Boards should ensure that they have the time and tools to bring these issues and solutions to the forefront in their deliberations.




Storytelling is a powerful tool, as Schwartz’s recommendation confirms. But his point also suggests that the traditional forms of risk information (namely, facts and figures) presented to the board often fall short of their intended objective: keeping the board effectively informed about the organization’s changing risk profile. This shortcoming may also qualify as a trend, as I’ve also reported before.

Count storytelling within the realm of risk management among one of the many trends (including lean GRC, behavioral risk management, principled performance, correlations between business ethics and the bottom line and the death of SAS 70 audits) I’m examining right now.




Three More GRC Tips for Boards


Schwartz suggests that boards of directors should request “more anecdotal information, rather than relying on statistics when reviewing the effectiveness of a compliance program.” (For a few other suggestions on the board’s approach to GRC issues, see the sidebar below.)

Related:

Toxic Market Plan Summary, IFRS Insights, and Other Links

Here’s what I’m reading today to help me make sense of the turbulent world of GRC:

And three more pieces of content, linked to here as much for levity and context as for GRC value:

• Financial Media Exposed to Transparency;

• You Think the U.S. Has Transparency Problems?

• You Think Your Job Is Tough? ###

Too often, I come across (and save) useful content for you, dear readers, that winds up lingering forever on my hard drive because too many other issues, developments, and blog entries push it aside.

• Cliff Notes on Toxic Assets Plan;

• Excellent IFRS Insights;

• The Bonus Tax is … Stupid;

• The Bonus Tax is … the Smoot-Hawley Tariff?

• Bernanke Calls for Super-Regulator

To address this challenge, I plan to periodically post a series of links identified succinctly by their subject.





Related:

Access to Funding Remains Tight


In July, about two-thirds of small and midsize businesses responding to a survey indicated that their access to credit was more limited than it was at the same time in 2008, Greenwich Associates reports. “Credit availability for small and midsize businesses continues to be a major issue,” said Greenwich consultant Steve Busby in a release.

Even as small signs that the economy is stabilizing make headlines – the S&P 500 is up 20 percent year-to-date, and pending home sales rose for the fifth month in a row, according to the National Association of Realtors – corporate treasurers still face the sorts of challenges that can cause even sound sleepers to bolt wide awake in the middle of the night.



Until access to credit improves enough that corporate financial execs feel confident opening their purse strings, the economy will continue to languish. And treasurers will face more nights of tossing, turning, and wondering just how secure their corporate funding is. ###

Larger companies continue to face their own struggles. The amount of commercial paper outstanding that was issued by nonfinancial companies dropped from $209 billion in January to $117 billion in June, according to Federal Reserve data. The average number of AA nonfinancial firms issuing paper dropped from 193 in March 2009 to 137 in July, a decline of about 30 percent. That’s also according to the Fed.


Among the most persistent has been the lack of credit. In May of this year, more than one-fourth of companies surveyed by the Association for Financial Professionals indicated that credit was tighter than it had been. As a result, nearly all the firms had taken some action to survive the credit crunch, including chopping capital spending and hiring.

Related:

IRS Becoming Better at Identifying Noncompliant Retirement Plans

If your company’s retirement plan isn’t following the rules, the IRS is increasingly likely to notice. That’s the conclusion of a recently released report by the Treasury Inspector General for Tax Administration, or TIGTA. Given the nearly 900,000 retirement plans in existence today, that’s key.

The Employee Plans function within the IRS is charged with determining whether retirement plans are complying with the tax-exempt provisions of the Internal Revenue Code and operating according to the terms of the plan document. For instance, it checks that plan sponsors are making contributions as required and that assets exist to satisfy the liabilities. This is important both for corporate finance execs who want to be confident that their plans are operating legally and for employees participating in the plan.



Over the past few years, the IRS’ Employee Plans function has become more adept at identifying plans that are noncompliant. As a result, the percentage of exams that led to a change in the filer’s tax returns has jumped from 47 percent in 2006 to 64 percent in 2010. That’s a positive step, as it means the IRS’ resources are focused where they’re most needed. It also means that plans (and their participants) are better protected.


The growth in the number of exams of noncompliant plans is a result of several factors. As a starting point, the IRS’ continuing analysis of historical results allows it to regularly update its knowledge of the retirement plan universe.

These types of exams typically result in more changes to the plans’ returns. In fact, in 2010, some change was made to more than 80 percent of returns examined as a result of special projects, abusive transactions or referrals. While that’s presumably not good news for the plans involved, it should help the plan participants. It also indicates this should be an area of focus for the IRS.

The IRS also analyzes retirement plans by plan type and the plan sponsors’ principal business. While these exams don’t produce the volume of changes other exams do, the rate of change has been increasing as well, as the IRS has made its sampling methods more efficient. For instance, the percentage of changes to profit-sharing plans from within the wholesale industry jumped from 23 percent in the 1990s to 40 percent between 2006 and 2010.

In addition, the IRS has boosted the number of plan examinations that involve special projects, abusive transactions and referrals. Special projects refers to plans that are selected for exams based on an analysis of historical data, combined with other information, such as changes to the tax laws. Abusive transactions are those intended to capture illegal tax benefits. Referrals can come from within the IRS or other agencies like the Department of Labor.




“In these tough economic times, it is even more important that the IRS ensure that retirement plans comply with all applicable statutes and regulations to provide plan participants with greater assurance that promised benefits will be available upon retirement,” said J. Russell George, the Treasury Inspector General for Tax Administration, in a statement announcing the report.


Related:

Dodd-Frank: Beyond Say on Pay

I was reminded of this need after discussing 2011 compensation trends with three compliance experts at The Hay Group. Irv Becker, who leads the firm’s U.S. compensation practice, suggested that I also consider another provision in the new regulation — “Additional Disclosure Requirements” – likely to require a significant amount of reflection and work.

Earlier this week, I discussed the “say on pay” provision within the Dodd-Frank Wall Street Reform and Consumer Protection Act. It’s something I will be writing about quite a bit in the coming month – without overdoing it.


This provision of Dodd-Frank requires companies to disclose “the median of the annual total compensation” of all of the company’s employees excluding the CEO, the annual total compensation of the CEO, and the ratio between the two. So, this provision will also require a significant amount of calculating. This article can help you and/or your compensation committee to be prepared when it comes time to conduct the reflection, math, and legwork the law’s new disclosures require. ###


Related:

Complying with Import Regulations

Finally, you don’t want to assume that just because your customs broker is handling much of the paperwork, he or she will fix any problems that arise. That’s not quite how it works, Pomerantz notes. To Customs, the broker is acting on the importer’s behalf. If the broker makes a mistake, it’s up to your firm to set things straight. “Final responsibility lies with the importer,” she says.




Operating under any one of these assumptions and misperceptions can cost your firm time and money. It’s much more cost-effective to manage the process correctly from the get-go. ###

While the value of goods and services imported to the U.S. fell from $217 billion in April 2008 to $150.3 billion in April 2009, according to the Bureau of Economic Analysis, corporate finance managers can’t assume that their firms are sitting pretty when it comes to complying with the myriad regulations that govern import transactions. After all, many companies are running on bare-bones staffs, which can mean that pesky details, like import regs, get pushed to the back burner. “Companies are focused on what they absolutely have to do to get goods into the country, versus looking at overall compliance,” says Susan Pomerantz, vice president of global trade management consulting with JPMorgan.




Another misperception: the idea that once Customs releases the shipment, you’re sitting pretty. Not so fast, Pomerantz says. A number of regulations, such as those relating to product safety, can come into play even after the goods have been released. “These absolutely carry past the time of Customs clearance,” she says. Any violations can be subject to recalls and penalties.

To avoid such punishment, financial executives need a good grasp of import regulation. Pomerantz outlines several misperceptions that can get in the way. First is the notion that the firm’s exposure is limited to the duties imposed by Customs. Often, these seem immaterial. As a result, double-checking the documentation may seem like a pointless clerical exercise. However, a firm’s exposure for many goods actually is the value of the imported goods, plus the duty. It pays to make sure this info is accurate.

That could prove troublesome down the road, Pomerantz adds. If U.S. Customs and Border Protection later determines that a company hasn’t fulfilled its obligation to document and manage the transaction, the agency could take any of several unwanted actions, she says. These include yanking the firm’s trade privileges, boosting the frequency of audits and inspections, or zapping the firm with penalties. None are any fun.

It’s easy to think that if your firm’s broker is keeping records of the import transactions, your firm itself is off the hook. That may work for a while, but you’re taking a chance, Pomerantz says. If the broker goes out of business or is acquired, your firm will need to get the backup records. It’s the same story if your firm decides to switch to another brokerage firm. The upshot? Your firm needs to retain copies of all records related to the transactions for five years. That includes everything from the purchase order to verification of payment.

At the same time, some importers assume that because their seller handled the invoice, they are not responsible for errors. Guess again. “It’s the importer’s responsibility to make sure documents are accurate,” Pomerantz says. A case in point: Under most free trade agreements, the importer typically is responsible for proving that the goods qualify under the agreement and for providing records to support this. That’s difficult to do, since few manufacturers in their right minds will release information that shows how the goods qualify. So, many importers roll the dice and assume they’ll be able to get the backup if they’re ever questioned. That may work … or it may not. If it doesn’t, the penalties can be tough.


A little work up front can save time and money down the road.


Related:

Four Forces That Will Shape IT Economics in 2012



In a recent Wall Street & Technology piece, Rubin elaborated on the four forces that he sees shaping IT economics this coming year. wiredFINANCE summarizes the piece below:

While many CFOs are focusing on issues like global market uncertainty and regulation, they should not ignore the underlying infrastructure. This infrastructure consists of the business technology engines that drive operational efficiency, enable product innovation and deployment, and enact risk/regulatory compliance. These are the applications and information systems your organization relies on every day.

His advice: focus on value production, risk management, and cost optimization simultaneously and continuously; recognize the role of data and technology in managing costs and opportunities; place extreme value on talent; foster incubation of innovation; and tune fixed versus variable costs to the performance profile of the enterprise and adjust staff distribution appropriately.


Howard Rubin formulated Rubin’s Law, a corollary of Moore’s Law, which has driven IT economics for two decades or more. Moore’s Law states that the number of circuits packed onto a silicon chip doubles every 18 to 24 months, and it has driven the IT industry’s ability to deliver more capability for the same or less money. Moore’s Law was good for business; Rubin’s Law is not.



Rubin’s lesson: you can’t slash IT out of the recession. You’ll need more IT than ever to compete in the new economy.

2. Upward IT Spending Pressure—the business is demanding more and different IT services. This entails more hardware and software and people to administer it all. As Rubin notes: we now have entered an era in which upward technology pressure (and expense) will continue to increase in all business scenarios, regardless of whether revenue increases, decreases, or stagnates. While automation, shared cloud infrastructure, new technologies like thin provisioning, and other techniques can delay the inevitable, eventually IT spending must rise.


4. Transitioning Business Models/Need for Urgent Change–current models by which companies manage the dynamics of their technology economics are failing, according to Rubin. Companies are spending the bulk of their IT budgets just keeping systems running with little left to address new business needs and opportunities.

3. Adapting Management Models to the New Technology Economy—after five years of cutting back, delaying investment, and doing more with less, there are very few cost-cutting tricks left. The pent-up demand for more and better technology cannot be put off much longer. Nor will organizations want to if they expect to stay competitive. Their competitors are opening new channels to customers through social media, finding new ways to deliver products and services through the cloud, and uncovering insights that lead to innovation and competitive advantage through collaboration, analytics, and big data.

Since the economic upheaval began in 2007-8, it has been clear that the usual ways of managing technology and the related economics don’t fit with the emerging post-recession economy. Rubin goes on to identify four forces that will impact the economics of IT in your organization.


1. Relentless Demand for Technology—your organization probably is using more technology services and capabilities: more apps, more storage, more email, more users and user accounts, more smartphones. As Rubin puts it: the demand for computing in the form of processors, storage, network bandwidth, and access via alternate devices is growing faster than the global economy and faster than can be offset by Moore’s Law.



Rubin’s Law examines the demand for computing power, stating: The geometric growth rate of computing demand—technology intensity in the context of business and our personal lives—will drive computing costs past the point at which Moore’s Law will keep the costs manageable. What this means is that discontinuous/disruptive technology and innovation are critical to the new economics we are about to encounter. In short, business needs technology innovation now.
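A back-of-the-envelope model makes Rubin’s point visible. In the Python sketch below, unit computing costs fall on a Moore’s Law-style curve (halving roughly every two years) while demand grows faster than that decline can offset; the specific growth rates are my own illustrative assumptions, not Rubin’s published figures.

```python
# Toy model of Rubin's Law: demand growth outruns Moore's Law cost declines,
# so total spend rises. Both growth rates are illustrative assumptions.
unit_cost = 1.00               # cost per unit of computing, year 0
demand = 1_000                 # units of computing demanded, year 0
COST_MULTIPLIER = 0.5 ** 0.5   # halving every two years, about 29% cheaper per year
DEMAND_MULTIPLIER = 1.45       # assumed 45% annual growth in computing demand

for year in range(6):
    spend = unit_cost * demand
    print(f"Year {year}: unit cost {unit_cost:.2f}, demand {demand:,.0f}, total spend {spend:,.0f}")
    unit_cost *= COST_MULTIPLIER
    demand *= DEMAND_MULTIPLIER
```

Run it with a slower demand curve (anything under roughly 41 percent a year in this setup) and total spend falls instead, which is the world Moore’s Law used to deliver; Rubin’s argument is that we have crossed to the other side of that line.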

Related:

A Look at the Tax Proposals in Obama’s Budget



In the United States, when the President submits his (and some day, her) budget to Congress, the focus often is on spending. The budget allows the President to outline the Administration’s priorities, deciding how much to allocate to, say, defense or social programs. Still, contained in the budget President Obama submitted to Congress earlier this week were a number of provisions geared to the revenue, or tax, side of the equation.

Read the full article at Businessfinancemag.com.

For starters, the President’s budget message outlined his belief that those who earn more should pay a higher rate of taxes. “It is wrong for Warren Buffett’s secretary to pay a higher tax rate than Warren Buffett. This is not about class warfare; this is about the Nation’s welfare.” With this in mind, the President stated his intention to push for the expiration of tax cuts for families making more than $250,000 a year, as well as of estate tax provisions more generous than those in place in 2009. “These policies were unfair and unaffordable when they were passed, and they remain so today,” he said.

Related:

GRC Needs Improvement



That doesn’t sound terribly effective or efficient.

The post-Sarbanes need was for a more disciplined, efficient, effective, and, in most cases, centralized approach to managing these corporate functions.

The need still exists today: 73 percent of respondents to an Ernst & Young risk survey indicate that their enterprises maintain seven or more risk functions. Sixty-seven percent of these respondents report that they have overlapping coverage among two or more risk functions; additionally, half of the respondents acknowledge that there are gaps in their organization’s risk coverage.


“Risk management functions within an organization often exist in silos that are disconnected from one another and the wider business strategy,” reports Gerry Dixon, Ernst & Young global risk leader. “As a result, risks identified in one area may not be communicated or recognized by another. Moreover, different areas within an organization may have different views on the severity or importance of certain risks.”


The term “governance, risk management and compliance (GRC)” entered the business vernacular as a result of a specific need in the wake of the “most sweeping U.S. regulatory reform since the formation of the Securities and Exchange Commission.”

For a clearer view on the severity of “silo-ed GRC,” here are additional survey results. ###


Related:

Companies Reassess Their Banking Partners


In addition to worries over these banks’ stability, there’s some concern that banks that accepted government dollars may be pressured to use that money within their home countries. To be sure, banks that want to remain a player in a global economy can’t simply pull back on existing agreements. But, political pressure could affect their future lending. “It’s a legitimate concern,” Colon notes.

Some of the banks that participated in the government bailout may now find a few of their own customers bailing. Respondents to a recent survey by research firm Greenwich Associates indicated they are looking at adding new banks to their roster of financial partners, given their concerns about the stability of the more troubled banks. That’s particularly true for banks that accepted government guarantees and capitalization. “We’re seeing more of a shift toward those banks with positive reputations,” says John Colon, managing director with Greenwich.


However, it’s unclear to what extent even firms that want to move truly will be able to switch their business. Large, multinational organizations need the breadth and depth of products and services that large, multinational banks provide, Colon notes. “They’re not just going to walk away.” More likely, these firms will look to the regional institutions to supplement their current group of banking partners.

Among the banks most likely to benefit in the U.S. are some regional players, like PNC and U.S. Bancorp, that (at least so far) haven’t been too terribly caught up in the financial crisis. Another winner is JPMorgan, whose reputation actually got a boost as a result of its actions over the past year, including its takeover of Bear Stearns.


It’s also hard to discern which of the regional banks made conscious decisions to avoid the lines of business that now are causing so much trouble, and which simply lacked the scale to make significant inroads into these businesses in the first place. Whether it was by design or luck, or some combination of the two, these banks should see their corporate business pick up. ###


Related:

Political Risk Exposure on the Rise


Bloomberg Businessweek reported this week that global insurance broker Marsh & McLennan predicts an immediate premium increase of 10 to 15 percent for companies seeking political risk coverage. Add to that the inflationary risk associated with rising oil and commodity prices, and the impact of events like those in Egypt is particularly unsettling. Now is the time for companies to review their potential exposures and seek the appropriate hedging and risk mitigation strategies. ###

With so much focus here in the U.S. on the political impacts of new regulations and deficit spending, it is easy to lose sight of the emerging political risks across the globe. However, with the political unrest in Egypt making headlines and bringing an entire economy to its knees, the risks are becoming very real for businesses with interests there as well as in other countries. A clear indicator of the rising risks companies face is the potential rise in political risk insurance premiums.

Related:

Schemes can continue for months or even years before they are detected

This spring, accounting firms, consultancies, and professional forensic associations are harvesting seasonal fraud surveys packed with criminal amounts of eye-opening information.


Here are some insights (with statistical support) from the ACFE report, which indicates that companies around the world lose about 5 percent of their annual revenues to fraud:

Fraud schemes are extremely costly. The median loss caused by the occupational fraud cases in the ACFE study was $160,000. Nearly one-quarter of the frauds involved losses of at least $1 million.

Schemes can continue for months or even years before they are detected. The frauds in the study lasted a median of 18 months before being caught.

Tips are key in detecting fraud. Occupational frauds are much more likely to be detected by tip than by any other means. This finding has been consistent since 2002, when the ACFE began tracking data on fraud detection methods.

High-level perpetrators do the most damage. Frauds committed by owners and executives were more than three times as costly as frauds committed by managers and more than nine times as costly as employee frauds. Executive-level frauds also took much longer to detect.

Whether or not North American risk management efforts – and the rise (or rebirth) of formal enterprise risk management programs – are putting a crimp on the corporate fraud crop remains to be seen (and will be something I blog on soon).


Fraud lurks everywhere these days, especially in my inbox.



This ought to plant at least a couple of fraud-management seeds in your mind, including thoughts about the health of your whistle-blowing process. ###

In the meantime, the results from Ernst & Young’s “11th Global Fraud Survey” and The Association of Certified Fraud Examiners (ACFE) “2010 Report to the Nations on Occupational Fraud & Abuse” are available to prying eyes.


Related:

SWIFT for Corporates Continues Growth


As the economy has slowly gained steam, many companies have expressed interest in more effectively managing their banking networks, Blair says. SWIFT for Corporates can be key to doing that. ###


Companies that considered SWIFT at one point but then refrained from moving forward may want to reconsider. Bob Blair, executive director with JPMorgan, identifies several enhancements to the service:


a) The Bank Readiness Certification Program: This “provides a registry so companies can see the level of a bank’s capabilities on SWIFT,” Blair says. According to information on the SWIFT website, a list of banks that meet the criteria will be published beginning in January 2011.

b) Standards development: The standards used within SWIFT continue to evolve, with the latest being ISO 20022. This is a universal financial industry message scheme, according to www.iso20022.org. The new standards are “more capable and complex,” Blair says. For instance, they can support mixed payables and have a global reach.

In 2010, through November, more than 3.6 billion messages had flowed back and forth through SWIFT’s systems, up about 7 percent from the same period in 2009. Payments made up nearly half the traffic, while securities accounted for 44 percent. While nearly two-thirds of its traffic came from Europe, the Mideast, and Africa, traffic from the Americas grew by 8.5 percent.


SWIFT, the acronym for the Society for Worldwide Interbank Financial Telecommunication, continues its penetration of the corporate marketplace. The number of corporate users topped 600 as of last year. SWIFT works with more than 9,000 banks, securities institutions, and corporate customers in 200-some countries to exchange millions of financial messages.


c) Connectivity options: If your firm lacks the resources to establish a direct connection with SWIFT, it has several options. SWIFT Alliance Lite is a browser-based solution, and often is used as a way to test the service before actually moving to SWIFT. In addition, organizations can work with a service bureau to connect to SWIFT, which reduces the upfront investment. (For more info, check out this earlier post.)


Related:

What Financial Services Execs Teach About Cloud Computing

Lesson 5 focused on finding programmers. Babcock reports one financial industry exec complaining that his most promising prospects repeatedly turned down job offers in favor of working for Oracle, Google, Amazon.com, Facebook, and Zynga, an online gaming company. C’mon, if you’re a talented 20-something programmer, who would you rather work for?

Lesson 2, as Babcock saw it, is the lack of ROI from cloud computing. Several fund managers noted that they hadn’t yet seen a prospective ROI on an investment in cloud computing, whether public cloud or private. They have little use for public cloud computing, and worries about maintaining the security of their accounts are only part of it. Of course, big financial services firms have suffered numerous costly security breaches even without using the cloud. Good security is a matter of careful diligence and has little to do with cloud computing. wiredFINANCE will address cloud ROI in an upcoming piece.


Nothing surprising here; companies have been making these IT mistakes for decades, long before cloud.


Lesson 3 revolves around the challenges of the changing regulatory climate. As one financial manager reportedly said: “the wildcard in market structure evolution is regulation as coming from the Securities and Exchange Commission, the European Infrastructure Market Regulation, and other agencies. Most likely: known security risks and exposures will have to be reported.” Again, this has nothing to do with cloud computing.

The financial services industry is a huge user of information technology, but it has been much slower to jump on the cloud bandwagon. Most of its cloud interest, rather, has revolved around virtualization and private clouds. wiredFINANCE addressed private clouds some months back here.

Kevin Jackson, writing for Forbes, focused on eight cloud mistakes here. Check out what the financial industry folks advise about cloud computing here.

Lesson 1 came from Madge Meyer, chief innovation officer and executive VP at State Street Bank. Her big focus is on virtualization, not cloud. At State Street, virtualization was a key step toward establishing a highly automated operation to handle cloud provisioning, orchestration, and management. Automation, she concluded, is where the real cloud computing savings come from.

At a recent gathering of financial services executives in Boston, the topic turned to cloud computing. InformationWeek reporter Charles Babcock was there and captured the mood of the gathering, pulling out what he considered five cloud lessons; the problematic word here is “lessons.”





• Lack of formal planning

• Missing or poor IT governance

• Poor or missing responsibility matrix

• Neglecting the human resource management challenges

• No program management office

• Failure to fully inventory assets

• Lack of oversight

• Inappropriate or lack of a service level agreement (especially with multiple cloud providers)

Forbes, meanwhile, listed the most important factors that have led to cloud transition failures:



Lesson 4 took up IT products. For those building out a data center optimized for virtualization some financial IT managers reported turning to a Cisco-EMC joint initiative called Virtual Computer Environment (VCE). Intel and VMware have also joined the VCE party. VCE is implemented through vBlocks, which are pre-integrated rack mount servers with converged networking and storage built in. vBlocks essentially are appliances. The IBM Workload Deployer is a cloud middleware appliance and IBM’s CloudBurst on Power Systems is a cloud hardware appliance. The cloud appliance market is hot.


Related:

Forget Disaster Recovery—Think Business Resilience and Risk Management


It is important to note that the focus of the study is business resilience, not DR, business continuity, or even risk management. Business resilience, according to IBM, refers to the ability of enterprises to adapt to a continuously changing business environment, not just to restore operations after a disaster or continue to function despite operational problems.

More study findings, for example, show a solid majority of respondents (60%) saying that business resilience is considered a joint responsibility of all C-level executives, although CIOs and IT professionals remain key players in building a more resilient organization. Similarly, large majorities say that data and application security (85%), data protection (79%), infrastructure security (77%), security governance (75%), identity and access management (74%), and compliance management (69%) now are part of their organization’s broader risk management strategy.

If your organization is not yet engaged in business resilience risk planning, now is a good time to start. Begin by downloading the report noted above. From there, IBM recommends assigning an enterprise-wide risk management team with a strong mandate to reach out across the organization because risk management should be part of everybody’s job.


Of course, business resilience helps organizations maintain continuous operations in the face of disruptions and disasters. But IBM envisions it as something more. IBM distinguishes business resilience planning from enterprise risk management (ERM) in that it is aimed more at building the organizational capacity to seize opportunities created by unexpected events. As such, it requires the engagement of everyone in the organization and often means a change in corporate culture to instill awareness not only of risks but of potential opportunities.


Companies would pay even less attention to disaster recovery than they do now if auditors and other compliance police didn’t get on their cases or threaten them with fines or liability of various sorts. Disaster recovery (DR) alone, however, may not be sufficient.


Among the findings of the study: organizations are diversifying their strategies to build business resilience, while keeping continuity, IT, and compliance risks in the forefront. And within business resilience strategies, cloud computing is quickly emerging as a key risk and opportunity management tool.

The study makes it clear that DR and business continuity are evolving into enterprise-wide risk management. Such risk management, the study notes, should involve everyone in the organization and embed responsibility for risk management at every level if companies are to respond effectively to changes and unexpected events.




The addition of opportunities to the risk management calculus adds a new dimension. Now it is not just about restoring servers in response to a sudden disaster but about bringing back the right capabilities and capacity to take advantage of new opportunities that may emerge from those events.



Based on its 2011 Global Business Resilience and Risk Study, IBM is suggesting a more proactive and forward-thinking approach to DR, one that encompasses opportunities as well as risks. The study is available here.

Business resilience, of course, continues to involve the CIO and IT because, in the end, it is about the protection and accessibility of the organization’s applications, systems, and data assets. However, 62% of respondents also noted they brought on board other C-level executives, and 44% even include board members.

A focus on business resilience with the goal of not only ensuring business operations continue but exploiting sudden new opportunities will require new investments and the involvement of new players across the company. For example, 58% of the respondents reported investing in new risk-related IT strategies, such as cloud computing with its ability to rapidly deploy new resources and capabilities.

Related:

The Next XBRL Deadline Nears






• Allow enough time: That’s particularly true for companies that are reporting their annual Form 10-Ks in XBRL for the first time. Given the greater complexity of the report when compared to a 10-Q, extra time and manpower typically are needed to get the job done accurately.

• creating a new element specific to the company when an existing element within the U.S. GAAP taxonomy would have done the job, and

XBRL, or eXtensible Business Reporting Language, is a means of tagging data so that it can be manipulated and moved, yet still be readily identified. Say a company’s sales for the quarter were $5 million. That number is tagged so that no matter how the financial statements are sliced and diced, it’s clear that this particular $5 million figure refers to quarterly sales.
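For readers who have never seen a tagged fact, here is roughly what that looks like under the hood. This Python sketch builds a heavily simplified, XBRL-style fragment with the standard library’s XML module; a real instance document would carry full namespace declarations, unit definitions, and taxonomy references that are omitted here.

```python
# Build a simplified, XBRL-style tagged fact for quarterly sales of $5 million.
# Illustrative only: namespaces, units, and schema references are pared down.
import xml.etree.ElementTree as ET

xbrl = ET.Element("xbrl")

# Context: which period the number belongs to.
context = ET.SubElement(xbrl, "context", id="Q1")
period = ET.SubElement(context, "period")
ET.SubElement(period, "startDate").text = "2011-01-01"
ET.SubElement(period, "endDate").text = "2011-03-31"

# The tagged fact: wherever this value travels, the tag still identifies it
# as revenue for the Q1 context, stated in U.S. dollars.
fact = ET.SubElement(xbrl, "us-gaap:Revenues",
                     contextRef="Q1", unitRef="USD", decimals="-6")
fact.text = "5000000"

print(ET.tostring(xbrl, encoding="unicode"))
```

However the resulting statements are sliced and diced downstream, the tag travels with the value, which is the whole point of the standard.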

In just under a month – June 15, to be exact – another 1,200-some public companies will submit interactive, or XBRL-tagged, financial statements to the SEC. They follow the companies that made the jump a year ago, which were those with a worldwide public equity float of at least $5 billion. This year, “all other domestic and foreign large accelerated filers using U.S. GAAP will be subject to the same interactive reporting requirement,” according to the SEC. Then, a year from now, smaller reporting companies and foreign private issuers that prepare their financial statements according to IFRS will follow.


• entering an amount with an incorrect sign or rounding.



While getting up and running with XBRL involves a learning curve, most companies have found that the process isn’t as arduous as they feared, Fragnito says. “This isn’t the Sarbanes-Oxley that some made it out to be,” he says, referring to the massive implementation efforts many companies endured in order to comply with the 2002 regulation. ###

The following steps can help your firm to avoid these mistakes and make the conversion to XBRL as smooth as possible:

• View this as a strategic initiative: “This is not just an IT project,” says Fragnito. XBRL can aid internal reporting just as much as it helps the reporting required for various government entities. For instance, acquisitive companies are finding that they can use XBRL to move data between different systems and entities without building expensive new frameworks, he adds.

• Implement XBRL as early in the reporting process as possible. Creating financial statements in an ERP system only to redo them using XBRL doesn’t make a whole lot of sense. “Our vision is for companies to implement XBRL further into the reporting process,” Fragnito notes. That boosts efficiency and enhances execs’ abilities to make well-informed decisions.


• selecting an element from the U.S. GAAP taxonomy that was either too broad or too narrow for the line item


Even so, gaining the benefits of XBRL isn’t a slam-dunk. According to this report from KPMG, some of the initial financial statements created in XBRL contained errors, such as missing or inaccurate information. Behind most errors were the following mistakes:

The promise of XBRL is clear: it makes assembling, analyzing, and comparing data more efficient, which provides companies more time to ensure the data’s accuracy. It also can make reporting information to government agencies easier, because the data can be assembled and tagged and then easily moved to create the variety of reports that may be required – without completely re-entering it all. “In XBRL format, it’s easy to reuse and redisplay data,” says Anthony Fragnito, chief executive officer with XBRL International, Inc., a consortium of 550 companies, financial institutions, and government agencies whose goal is to build the XBRL language and promote and support its adoption.


Related:

A Refresher on Recession-Response Risks


In my previous post, I mentioned the downside of our boom-and-bust cycle and how it often translates to less-than-ideal decision-making atop organizations.

Here’s a Bloomberg BusinessWeek list – “10 Worst Innovation Mistakes in a Recession” – that I bookmarked three years ago and frequently re-read as a reminder to avoid boom-and-bust decision-making at my own business. Keep it handy in case 2012 looks like 2008.


Related:

Rolling Forecasts are a Good First Step Toward Smarter Financial Planning

Rolling forecasts can make companies’ planning and budgeting more valuable as a business tool. There’s evidence of growing interest in using them, which I find encouraging.


In trying to become leaner over the past three years, the main adaptation that United States corporations have made has been to shed employees and hire them back slowly. Consequently, those remaining have had to work harder but also smarter. In some cases, this has meant simply eliminating less productive tasks or adopting new approaches to make better use of their time. Overall, I’ve noted small but steady improvements in the use of information technology to improve execution, enhance visibility into business conditions and coordinate actions to make up for having fewer people.

The high degree of volatility in business conditions – in particular raw material costs, exchange rates, market prices and even demand – places a premium on having the agility to adapt successfully to these ever changing conditions. And, indeed, companies that spent a couple of months in the fall of 2008 carefully preparing their budgets discovered early in 2009 that the assumptions they had made were useless for running the business. They found the same was true the following year.

I recently participated in a panel discussion about the rise in the use of rolling forecasts in corporate planning. I’m not surprised by this trend; I have encouraged it. Ever since the financial crisis started three years ago, I’ve been writing that companies should rethink how they plan and budget to respond to increasing business volatility. Rolling forecasts are useful because they continually extend the formal planning horizon out more than a year rather than having it stop abruptly at the end of a company’s fiscal year. They can be the right first step in improving the effectiveness of a company’s budgeting process, but ultimately I believe that organizations need to adopt a better approach to planning – what I refer to as integrated business planning. Moreover, companies that want to adopt a rolling forecast approach must first make important changes to their planning and budgeting processes to make them leaner, more focused and faster.
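To illustrate the mechanics, here is a small sketch of a rolling horizon in Python; the five-quarter window and starting quarter are arbitrary assumptions for the example, not a recommended horizon.

# Minimal sketch of a rolling forecast horizon: a fixed window of future
# quarters that advances each period instead of stopping at fiscal year end.
from collections import deque

def next_quarter(q):
    year, num = q
    return (year + 1, 1) if num == 4 else (year, num + 1)

horizon = deque([(2012, 3), (2012, 4), (2013, 1), (2013, 2), (2013, 3)])

def roll(horizon):
    """Close out the oldest quarter and extend the window by one."""
    horizon.popleft()
    horizon.append(next_quarter(horizon[-1]))
    return horizon

roll(horizon)
print([f"Q{n} {y}" for (y, n) in horizon])  # always five quarters ahead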

This may turn out to be the easy part of adapting to fluctuating business conditions. The tepid recovery of many of the developed world’s economies has validated the basic decision to cut operations to the bone and to operate in as lean a fashion as possible. The sluggishness of business activity in many of these countries has caused lingering expectations of a double-dip recession. For the moment, staying lean in this environment is the least risky approach. But it also can mask the need to do something significant about planning. What happens when demand picks up and companies have to balance their need to meet growing demand with a desire to minimize the risk of becoming overextended? They need to have a better planning and budgeting process in place.


To support rolling forecasting, driver-based modeling can speed up reforecasting cycles and facilitate lean planning. Driver-based modeling recognizes explicit input and output relationships. Inputs include not just direct items such as materials and labor but also indirect resources such as the number of sales calls (and related expenses) needed to sell the forecasted units. Effective driver-based forecasting models incorporate a units-times-rate structure, keeping “things” such as units sold, headcount, and tons of steel and boxes separate from the monetary value of the units. Thus, as prices change it’s possible to quickly incorporate them into the model. Having a driver-based model makes it easier and faster to incorporate changes to the plan, as the impact of changes in forecast sales volumes, for example, will immediately ripple through and show the resources (and change in resources from the last forecast) needed to achieve the forecast.
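Here is a minimal sketch of that units-times-rate idea in Python; the drivers, rates, and volumes are invented for illustration and are not figures from our research.

# Sketch of a units-times-rate driver model. All figures are illustrative.
units_forecast = 10_000            # units expected to be sold

drivers = {                        # "things" consumed per unit sold
    "materials_units": 1.0,
    "labor_hours": 0.5,
    "sales_calls": 0.02,
}
rates = {                          # monetary value per "thing"
    "materials_units": 12.00,      # $ per unit of material
    "labor_hours": 35.00,          # $ per hour
    "sales_calls": 150.00,         # $ per call (time, travel, expenses)
}

def reforecast(units):
    """Recompute resource needs and cost when the volume forecast changes."""
    quantities = {k: units * v for k, v in drivers.items()}
    cost = sum(quantities[k] * rates[k] for k in drivers)
    return quantities, cost

base_qty, base_cost = reforecast(units_forecast)
new_qty, new_cost = reforecast(12_000)   # the volume assumption changes...
# ...and the change ripples through to required resources and spend:
print({k: new_qty[k] - base_qty[k] for k in drivers}, new_cost - base_cost)

Because prices live only in the rates table, a price change is a one-line update that flows through the model the same way.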


The traditional annual budget has served as the closest thing to integrated business planning. A great deal of formal and informal planning goes on in a company, but it tends to be done in local entities and/or functional silos. There are sales forecasts, shop floor plans, headcount plans and others. However, these are only imperfectly integrated into the budget. Indeed, our research shows that the sales plan, which is a major driver of the annual budget, is typically two months out of date by the start of the fiscal year. Not surprisingly, therefore, our benchmark research also finds that fewer than one in 10 companies (9%) react to a major change in their operating environment in a well-coordinated fashion. More than one-third (36%) admit they are uncoordinated and more than half (54%) say they are “somewhat coordinated.” Although that might sound innocuous, like a “somewhat coordinated” juggler, corporations that fall short in coordination wind up dropping a lot of balls, and that shortfall has an impact on the bottom line. Indeed, more than one-third (37%) of these companies allow that the lack of coordination occurs frequently and that they spend a lot of time and effort dealing with it.





Rolling forecasts do not address the need to integrate business planning, but using them requires a lean planning approach, which is a foundation for integrated business planning that we have researched thoroughly. But I have to caution CFOs and controllers when it comes to adopting rolling quarters: Companies that try to replicate a multiple-month budgeting process four times a year will quickly (and correctly) conclude that it’s not worth the effort. Therefore, at a minimum, the rolling forecast must incorporate only the relatively short list of specific items that are key to managing the business.

Related:

Seven IT Standards for CFOs

A punch line to a joke in the IT industry goes like this: the nice thing about standards is that there are so many of them. OK, it’s not much of a joke. The point is that standards work best when there are only a few. In the IT industry, however, there already is an abundance of standards — and more keep arriving.

There even is an abundance of standards bodies. The American National Standards Institute (ANSI), the International Organization for Standardization (ISO), the IEEE Standards Association, and the World Wide Web Consortium (W3C) are just the start. The Federal Information Processing Standards (FIPS) add even more.

The CFO, however, is interested only in a few. Below, wiredFINANCE will hit on seven IT standards that impact the finance operation while skipping the ones so totally generic that you live with them every day and may not even realize it, like TCP/IP and HTTP, which underlie the Internet. Feel welcome to add any missed below in the comments section.

(1) XML (eXtensible Markup Language) — has emerged as the lingua franca of IT. Created and managed by the W3C, it provides a generic format that facilitates the exchange of information from a wide variety of otherwise incompatible structural formats. XML is what allows you to load data from your customer’s incompatible system into your system and get meaningful results rather than crash your system.

(2) XBRL (eXtensible Business Reporting Language) — an open data standard for financial reporting based on XML. It is what enables automated systems to find and extract data from business reports based on its semantic meaning, saving you from having to read through every footnote looking for a specific type of information.

(3) EDI (Electronic Data Interchange) — the computer-to-computer exchange of structured information using agreed-upon message standards. It is the foundation of B2B ecommerce.

(4) PCI (Payment Card Industry) — a set of standard security processes insisted upon by the credit card providers to ensure the protection of cardholder information by every organization that touches credit card data. Failure to comply with PCI standards can result in serious penalties and substantial legal liability.

(5) PKI (Public Key Infrastructure) — a computer security architecture based on a message encryption process using two keys to scramble the data and render it unintelligible; one key is public and used by the sender to encrypt the message; the other is private and used by the recipient to decrypt the message. It offers a flexible way to exchange information over public networks while ensuring that the information remains private.

(6) AES (Advanced Encryption Standard) — the encryption standard adopted by the U.S. government. The standard comprises three block ciphers, AES-128, AES-192, and AES-256. The number refers to the length of the encryption key in bits; the higher the number, the more resistant the encryption is to codebreaking.

(7) SQL (Structured Query Language) — although CFOs probably don’t directly use SQL, it is the standard query language behind many of the query and reporting tools that they do use. It is the English-like programming language used to extract information from relational databases.
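To show what that looks like in practice, here is a small sketch using Python’s built-in sqlite3 module; the invoices table and figures are made up, but the SELECT statement is representative of what a reporting tool issues on a CFO’s behalf.

# Illustrative only: the invoices table and amounts are made-up sample data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (region TEXT, quarter TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?, ?)",
    [("East", "Q1", 120000.0), ("East", "Q2", 135000.0),
     ("West", "Q1", 90000.0), ("West", "Q2", 110000.0)],
)

# Total billings by region for Q2 -- extracted by meaning, not by position.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM invoices "
    "WHERE quarter = 'Q2' GROUP BY region ORDER BY region"
):
    print(region, total)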


Feel welcome to add any that wiredFINANCE has missed. ###

Related:

U.S. Senate Debates the Value of Tax Extenders


Tax extenders — the term given to provisions in the tax code that, if not renewed every year or two, will expire — were the subject of a recent hearing of the U.S. Senate Finance Committee.

More than 130 extenders, from the R&D credit to deductions for state and local taxes, currently clutter the tax code, Senator Max Baucus (D-Mont.), chairman of the Committee, said at the hearing. The uncertainty they create causes hassles for both homeowners and businesses. “For businesses to succeed, Congress must provide a stable and certain tax code,” Baucus said.

Moreover, deciding whether to renew the extenders, and if so, how to cover the income foregone, takes Congress away from other issues. And giving deductions or credits to certain groups of taxpayers means that taxes elsewhere have to go up, or expenditures down. Then there’s a philosophical argument: “If a provision is worthy of being in the tax code, then it generally should be made permanent,” Senator Orrin Hatch (R-Utah), the Committee’s ranking member, stated.

Read the full article at Businessfinancemag.com.

Related:

How to Maximize Internal Audit’s Value

Here are several quick-hit GRC topics that did not make it into my posts during the past two weeks:

How to Maximize Internal Audit’s Value: PwC guidance in a nifty eight-slide presentation;

Survey on IFRS Conversion Pace: 60 percent agree with the current pace or say “not fast enough”;

SEC Considers Climate Disclosures: Possible guidance coming on disclosure requirements related to climate change. ###

Related:

Risk Management “Fuels” Sustainability


I’ve been researching the integration of risk management and performance management recently. Well, here’s another interesting, valuable, and scientifically validated disciplinary intersection: risk management and corporate sustainability.

A new report from FM Global, “The Influence of Risk Factors on Sustainable Development,” contains research that shows how certain risk management systems (e.g., fire protection systems within large industrial buildings) can (over the lifetime of these buildings) help reduce carbon emissions.

“Without effective fire protection systems,” the report states, “the risk of fire increases the carbon emissions by [1 to 2 percent] over the life cycle of a standard building, and can add up to 14 percent to the carbon emissions over the lifetime of a facility exposed to extensive fire hazards.”

This is fascinating information to factor into a more precise analysis of risk-management investments.

The report is available, in exchange for your contact information, here.

In addition to demonstrating a promising link between traditional risk management and innovative corporate sustainability, the report also marks the first time in FM Global’s 174-year history that the company is making its technical research available directly to the public. Previously, all of FM Global’s technical research was considered proprietary information and kept confidential. ###

Related:

Facebook IPO Makes Social Commerce Real





Even before the Facebook IPO companies began capitalizing on social business. wiredFINANCE covered it here. The hoopla around the Facebook IPO will intensify the focus on social commerce no matter how the stock does.

The initial kick came last year with a Booz and Company study here. In that study Booz reports that one-third of companies already had a senior executive who is responsible for social media company-wide. Among companies that consider themselves best-in-class the figure jumped to 41%. What will the mega Facebook IPO do?

In the Booz study social media emerged as a CEO-level agenda item for many companies. If it hasn’t at your organization, it probably will in 2012 as organizations everywhere scramble to 1) figure out what it is all about, 2) identify a strategy for getting involved, and 3) plant a stake somewhere in social commerce. Social media is not just about communications and entertainment anymore.

The highly anticipated Facebook IPO will have ramifications for all manner of businesses and how they operate. Just as the Google IPO in 2004 changed the search engine business, catapulting search engines into a major marketing driver and changing how companies spend for advertising and evaluate its effectiveness, Facebook’s IPO will confirm the arrival of social networking as a force in business.

With social commerce, the big consumer product companies are moving in fast. Here they want to build up a fan base of followers and collect sought-after Like clicks. Some are adding rudimentary transaction links so followers and visitors can actually buy something. These are rudimentary because they redirect the buyer to a commerce site elsewhere. Ideally, though, you want to do your commerce right there on the social website, on Facebook, and make it part of the social community itself.




One promising approach focuses on using the rewards and incentives of a loyalty program to help the business message go viral on social media. The trick will be to find the right message, tone, style, and attitude. Expect to see much experimentation and innovation around the challenge of integrating commerce and social business, possibly through gamification.

This is not unlike what companies went through when the World Wide Web first emerged. Companies were wary at first, slowly establishing websites that resembled little more than electronic brochures. Then e-commerce took hold in a big way. Today even the smallest pizza joint has a website, and B2B and B2C web commerce is pervasive.

Gartner already suggests gamification will be big, predicting that more than 70% of Global 2000 organizations will have at least one gamified application by 2014. wiredFINANCE identified gamification in the 2012 trends piece here. Gamification plays right into social commerce.

Facebook, with over 800 million subscribers and still growing, will drive social commerce and change business as Google did before it. As with the emergence of the Web and of Google, with serious money in play, finance won’t want to sit on the social commerce sidelines for long.

Related:

Manage Spending and Procurement as One

In most organizations, spending and procurement are different disciplines managed by different people. In the end, however, they are about the same thing — spending the organization’s money. So it was only a matter of time before someone would combine the two. That’s what Coupa Software, a cloud-based SaaS spend management player, did.

Actually, a number of vendors focus on e-procurement, and others focus on spend management. Here, Gartner identifies 11 vendors. Coupa is trying to break out of the pack.

There is no shortage of SaaS procurement and spend management players. Gartner identifies Verian Technologies, SciQuest, Puridium, Proactis, PurchasingNet, Periscope, Perfect Commerce, Ketera Technologies, ePlus, Elcom, BirchStreet Systems, Basware, Ariba, Epicor, SAP, Unit 4 Agresso, Lawson, Oracle, COA, Quadrem, and Hubwoo, along with Coupa.

One of the side benefits of having spend/procurement management together as a cloud service is that it provides an opportunity to peek at the customer activity in aggregate to spot trends (given that the right security protections are put in place from the start, of course). SaaS vendors, if they are smart, look at what their clients do with the software all the time as they plan enhancements and fixes. What’s less usual is publishing what they see, even in aggregated form. Still, Coupa’s latest numbers are interesting.

Coupa, for example, reports that median spending increased 5.03% last quarter. Four more interesting data points (a quick check of the arithmetic follows the list):

1. Time saved on purchase cycles: Average purchase request approval cycle time dropped from 19.4 hours to 17.1 hours.

2. Categories showing increased spending: Telco & Internet Service, Technology Services, and MRO & Manufacturing exhibited the greatest increases.

3. Categories showing decreased spending: IT Hardware, Marketing Services, and Transportation Services were the biggest losers.

4. Payment terms held steady: Virtually no change in average payment terms, from 31.87 days to 31.79 days.
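As promised above, a quick back-of-the-envelope check of those figures; the arithmetic is mine, the data points are Coupa’s as quoted.

# Back-of-the-envelope check of the benchmark figures quoted above.
def pct_change(before, after):
    return (after - before) / before * 100

print(f"Approval cycle time: {pct_change(19.4, 17.1):.1f}%")       # about -11.9%
print(f"Average payment terms: {pct_change(31.87, 31.79):.2f}%")   # about -0.25%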

Eventually, the software provider hopes that its reported data becomes a widely accepted financial index of corporate spending across all categories — essentially, a look into spending behaviors that previously wasn’t available. Of course, the data is only useful to the extent that Coupa’s customers represent a meaningful segment of the market at large. Still, it may be useful to see how your numbers stack up.

For Coupa’s own customers, such an index may have more value. They can compare their actual data to the benchmark. Coupa promises to present 40 different benchmarks, most of which are transactional or operational. Companies doing procurement and spend management with more conventional systems, however, may have difficulty extracting the necessary comparative data.

Maybe the most useful thing that Coupa does is not benchmarking but combining procurement and spend management, which should give the CFO better visibility into both. If nothing else, instead of sitting through two meetings to get data, it will take only one meeting — or none. At best, it will help the CFO identify and possibly resolve problems through a single online screen with just a couple of clicks of the mouse. ###



Related:

loophole



Loophole:

1 a: a small opening through which small arms may be fired

b: a similar opening to admit light and air or to permit observation

2: a means of escape; especially: an ambiguity or omission in the text through which the intent of a statute, contract, or obligation may be evaded

Example: She took advantage of a loophole in the tax law.

—Merriam Webster Dictionary

A recent study released by the left-leaning Citizens for Tax Justice and the Institute on Taxation and Economic Policy has generated extensive public comment. The study found that many of the most profitable companies in the United States—General Electric, Boeing, DuPont, Wells Fargo, Verizon, etc.—paid little or no corporate income tax over the three-year period 2008-10. In fact, 78 of the 280 companies studied paid zero tax or less in at least one of the three years. The average effective tax rate for all 280 companies studied for the period was 18.5%—slightly over half of the 35% statutory tax rate, and 6.1% less than the effective tax rate that companies with overseas operations paid on their foreign profits. The study blamed “corporate tax subsidies” or “loopholes” such as accelerated depreciation, stock options, industry-specific tax breaks, and deferral of foreign income for this result.

On the other (right) hand, however, there are those who believe that U.S. corporate tax rates need to be reduced significantly: the U.S. statutory rate of 35% is the second highest (behind Japan) in the world, even ignoring the effects of state and local income taxes in increasing the statutory rate even further. In this view, the combination of high rates, worldwide taxation, and a competitive global marketplace makes the U.S. corporate tax system punitive.

Those who focus exclusively on tax rates tend to forget that two factors determine how much tax a corporation pays: taxable income and rates. A low rate can result in a high effective tax if all revenue is taxable; on the other hand, a high rate can result in low taxes if there are numerous deductions from revenue to determine taxable income. The U.S. is an example of the latter approach: relatively high rates and a relatively narrow base. The narrow base results from Congress’s using the tax code to incentivize certain behaviors such as home ownership, borrowing money, capital expenditures, R&D, retirement savings, etc.
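To see the rate-versus-base point in numbers, here is a toy example; the figures are invented purely for the arithmetic and are not drawn from the study.

# Invented figures to illustrate rate versus base; not data from the CTJ study.
revenue = 100_000_000.0

# Regime A: high statutory rate, narrow base (lots of deductions).
deductions_a, rate_a = 60_000_000.0, 0.35
tax_a = (revenue - deductions_a) * rate_a          # 14,000,000

# Regime B: lower statutory rate, broad base (few deductions).
deductions_b, rate_b = 10_000_000.0, 0.20
tax_b = (revenue - deductions_b) * rate_b          # 18,000,000

# Measured against the same revenue, the "high-rate" regime collects less.
print(tax_a / revenue, tax_b / revenue)            # 0.14 vs 0.18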

To call these “loopholes” overstates the case. They are not “ambiguities or omissions”—rather, they are intentional subsidies for favored activities. That is why it is so difficult to modify our system of taxation: each “tax expenditure” has a vocal constituency that benefits from it and will fight ferociously to keep it. In theory, all of the deficit reduction commissions so far agree that reducing tax rates and broadening the tax base would benefit the economy; in practice, Congress cannot agree to do it. Perhaps the current congressional “Super Committee” will be different?

Any tax advice contained herein was not intended or written to be used, and it cannot be used by any taxpayer, for the purpose of avoiding U.S. federal, state, or local tax penalties.

Related:

Sharing Information Online — It’s All About Security

How do you share information with third parties during due diligence? In the past, companies set up a secure physical data room and all the parties came with their cartons of documents. Even today, many ship data CDs via FedEx. Most exchange email with attachments among the various parties. Email attachments are utterly insecure in most cases.

Accellion, Inc., which provides secure managed file transfer, avoids the dangers of sending email attachments in the clear over the Internet. Its latest survey found customers coming to it because of concerns about security, email attachment size limitations, and frustration over FTP complexities.

However, SaaS and cloud computing are combining to deliver what may yet prove to be a faster, easier, and cheaper solution that can provide near bullet-proof security. These services often are referred to as online collaboration space or online data rooms.

EURIM, the European Information Society Group, published a report some years ago touting online document sharing as the wave of the future, even for confidential information. EURIM’s focus primarily was on government agencies. One of its conclusions: “Advances in computing, particularly in networking and security, are making it possible to reduce identity theft through secure data sharing.”

That might not sound reassuring to CFOs faced with sharing sensitive information during the M&A process, compliance audits, and such. Data theft more than identity theft is their immediate concern. Nobody wants their financials, business strategies, customer lists, unannounced product specs, or anything else compromised.

The problem: Online collaboration space is available at many Web sites, ranging from ElephantDrive and 4shared at the low end to IntraLinks and Brainloop at the high end, with dozens more in between. You will have to look closely at what each site offers in terms of features, particularly security. Just having a password-protected log-in is absolutely NOT sufficient security for important sensitive corporate information.

What kind of security should you look for? At a minimum, you want:

• Strong encryption (AES, 256-bit) for documents, both while they are stored and while they are in transit (a short illustrative sketch follows this list)

• Strong authentication (two-token authentication)

• Traceable audit trails and watermarking of documents

• Digital rights management to control documents wherever they end up

• Operator shielding, which prevents the online vendor from accessing the data or the encryption keys
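For a sense of what the encryption bullet implies, here is a minimal sketch of AES-256 authenticated encryption using the widely used Python cryptography package; key storage, nonce handling, and the document itself are simplified illustrations, not any vendor’s actual implementation.

# Minimal illustration of AES-256 authenticated encryption (AES-GCM) with the
# Python "cryptography" package. Deliberately simplified for illustration.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit key
aesgcm = AESGCM(key)

document = b"Draft purchase agreement - confidential"
nonce = os.urandom(12)                      # must be unique per encryption
ciphertext = aesgcm.encrypt(nonce, document, None)

# Only a holder of the key (not the storage operator) can recover the text.
assert aesgcm.decrypt(nonce, ciphertext, None) == document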


These features represent the acceptable bare minimum. You will have to grill each online vendor thoroughly and press for the details on how each implements security and the level of control you have. In short, you want the vendor to maintain a highly reliable online space while you and only you control what happens to the documents within that space.

Once you have been assured that the online collaborative space is secure and under your control, the advantages of an online data room for due diligence, M&A, and similar activities involving sensitive documents make it a no-brainer. You may never ship data CDs overnight via FedEx or send attachments unencrypted again. ###


Related: