author | rguenth <rguenth@138bc75d-0d04-0410-961f-82ee72b054a4> | 2009-04-03 10:24:28 +0000 |
---|---|---|
committer | rguenth <rguenth@138bc75d-0d04-0410-961f-82ee72b054a4> | 2009-04-03 10:24:28 +0000 |
commit | dd277d48c6583b9ac3a360761cf4484f021c9f0b (patch) | |
tree | c6c1826ba3cd971c15d66cb4b498f99ad646ed70 /gcc/tree-ssa-operands.c | |
parent | 5c5d5311fbca9e18422cd2540973ffb3f39020bb (diff) | |
download | gcc-dd277d48c6583b9ac3a360761cf4484f021c9f0b.tar.gz |
2009-04-03 Richard Guenther <rguenther@suse.de>
PR middle-end/13146
PR tree-optimization/23940
PR tree-optimization/33237
PR middle-end/33974
PR middle-end/34093
PR tree-optimization/36201
PR tree-optimization/36230
PR tree-optimization/38049
PR tree-optimization/38207
PR tree-optimization/38230
PR tree-optimization/38301
PR tree-optimization/38585
PR middle-end/38895
PR tree-optimization/38985
PR tree-optimization/39299
* tree-ssa-structalias.h: Remove.
* tree-ssa-operands.h (NULL_USE_OPERAND_P): Make of type use_operand_p.
(NULL_DEF_OPERAND_P): Make of type def_operand_p.
(struct vuse_element_d): Remove.
(struct vuse_vec_d): Likewise.
(VUSE_VECT_NUM_ELEM, VUSE_VECT_ELEMENT_NC, VUSE_ELEMENT_PTR_NC,
VUSE_ELEMENT_VAR_NC, VUSE_VECT_ELEMENT, VUSE_ELEMENT_PTR,
SET_VUSE_VECT_ELEMENT, SET_VUSE_ELEMENT_VAR, SET_VUSE_ELEMENT_PTR,
VUSE_ELEMENT_VAR): Likewise.
(struct voptype_d): Likewise.
(NUM_VOP_FREE_BUCKETS): Likewise.
(struct ssa_operands): Remove vop_free_buckets and mpt_table fields.
(struct stmt_operands_d): Remove.
(VUSE_OP_PTR, VUSE_OP, SET_VUSE_OP, VUSE_NUM, VUSE_VECT,
VDEF_RESULT_PTR, VDEF_RESULT, VDEF_OP_PTR, VDEF_OP, SET_VDEF_OP,
VDEF_NUM, VDEF_VECT): Likewise.
(copy_virtual_operands): Remove.
(operand_build_cmp): Likewise.
(create_ssa_artificial_load_stmt): Likewise.
(enum ssa_op_iter_type): Remove ssa_op_iter_vdef.
(struct ssa_operand_iterator_d): Remove vuses, vdefs, mayuses,
vuse_index and mayuse_index members. Pack and move done and iter_type
members to the front.
(SSA_OP_VMAYUSE): Remove.
(SSA_OP_VIRTUAL_USES): Adjust.
(FOR_EACH_SSA_VDEF_OPERAND): Remove.
(unlink_stmt_vdef): Declare.
(add_to_addressable_set): Remove.
* tree-vrp.c (stmt_interesting_for_vrp): Adjust.
(vrp_visit_stmt): Likewise.
* doc/tree-ssa.texi (Alias analysis): Update.
* doc/invoke.texi (max-aliased-vops): Remove docs.
(avg-aliased-vops): Likewise.
* tree-into-ssa.c (syms_to_rename): Remove.
(need_to_update_vops_p): Likewise.
(need_to_initialize_update_ssa_p): Rename to ...
(update_ssa_initialized_fn): ... this. Track function we are
initialized for.
(symbol_marked_for_renaming): Simplify.
(add_new_name_mapping): Do not set need_to_update_vops_p.
(dump_currdefs): Use SYMS_TO_RENAME.
(rewrite_update_stmt): Always walk all uses/defs.
(dump_update_ssa): Adjust.
(init_update_ssa): Take function argument. Track what we are
initialized for.
(delete_update_ssa): Reset SYMS_TO_RENAME and update_ssa_initialized_fn.
(create_new_def_for): Initialize for cfun, assert we are initialized
for cfun.
(mark_sym_for_renaming): Simplify.
(mark_set_for_renaming): Do not initialize update-ssa.
(need_ssa_update_p): Simplify. Take function argument.
(name_mappings_registered_p): Assert we ask for the correct function.
(name_registered_for_update_p): Likewise.
(ssa_names_to_replace): Likewise.
(release_ssa_name_after_update_ssa): Likewise.
(update_ssa): Likewise. Use SYMS_TO_RENAME.
(dump_decl_set): Do not print a newline.
(debug_decl_set): Do it here.
(dump_update_ssa): And here.
* tree-ssa-loop-im.c (move_computations): Adjust.
(movement_possibility): Likewise.
(determine_max_movement): Likewise.
(gather_mem_refs_stmt): Likewise.
* tree-dump.c (dequeue_and_dump): Do not handle SYMBOL_MEMORY_TAG
or NAME_MEMORY_TAG.
* tree-complex.c (update_all_vops): Remove.
(expand_complex_move): Adjust.
* tree-ssa-loop-niter.c (chain_of_csts_start): Use NULL_TREE.
Simplify test for memory referencing statement. Exclude
non-invariant ADDR_EXPRs.
* tree-pretty-print.c (dump_generic_node): Do not handle memory tags.
* tree-loop-distribution.c (generate_memset_zero): Adjust.
(rdg_flag_uses): Likewise.
* tree-tailcall.c (suitable_for_tail_opt_p): Remove memory-tag
related code.
(tree_optimize_tail_calls_1): Also split the
edge from the entry block if we have degenerate PHI nodes in
the first basic block.
* tree.c (init_ttree): Remove memory-tag related code.
(tree_code_size): Likewise.
(tree_node_structure): Likewise.
(build7_stat): Re-write to be build6_stat.
* tree.h (MTAG_P, TREE_MEMORY_TAG_CHECK, TMR_TAG): Remove.
(SSA_VAR_P): Adjust.
(struct tree_memory_tag): Remove.
(struct tree_memory_partition_tag): Likewise.
(union tree_node): Adjust.
(build7): Re-write to be build6.
* tree-pass.h (pass_reset_cc_flags): Remove.
(TODO_update_address_taken): New flag.
(pass_simple_dse): Remove.
* ipa-cp.c (ipcp_update_callgraph): Update SSA form.
* params.h (MAX_ALIASED_VOPS): Remove.
(AVG_ALIASED_VOPS): Likewise.
* omp-low.c (expand_omp_taskreg): Update SSA form.
* tree-ssa-dse.c (dse_optimize_stmt): Properly query if the rhs
aliases the lhs in a copy stmt.
* tree-ssa-dse.c (struct address_walk_data): Remove.
(memory_ssa_name_same): Likewise.
(memory_address_same): Likewise.
(get_kill_of_stmt_lhs): Likewise.
(dse_possible_dead_store_p): Simplify, use the oracle. Handle
unused stores. Look through PHI nodes into post-dominated regions.
(dse_optimize_stmt): Simplify. Properly remove stores.
(tree_ssa_dse): Compute dominators.
(execute_simple_dse): Remove.
(pass_simple_dse): Likewise.
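A minimal sketch, in the usual C testcase style, of the kind of store the reworked DSE can now delete by walking the virtual use-def chain through a PHI into the post-dominated region (illustrative only, not one of the new testcases):

```c
int x, y;

void
f (int b)
{
  x = 1;        /* dead: overwritten on every path before any read */
  if (b)
    y = 2;      /* unrelated store, creates a virtual PHI at the join */
  x = b;        /* post-dominates and kills the store to x above */
}
```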
* ipa-reference.c (scan_stmt_for_static_refs): Open-code
gimple_loaded_syms and gimple_stored_syms computation.
* toplev.c (dump_memory_report): Dump alias and pta stats.
* tree-ssa-sccvn.c (vn_reference_compute_hash): Simplify.
(vn_reference_eq): Likewise.
(vuses_to_vec, copy_vuses_from_stmt, vdefs_to_vec,
copy_vdefs_from_stmt, shared_lookup_vops, shared_vuses_from_stmt,
valueize_vuses): Remove.
(get_def_ref_stmt_vuses): Simplify. Rename to ...
(get_def_ref_stmt_vuse): ... this.
(vn_reference_lookup_2): New function.
(vn_reference_lookup_pieces): Use walk_non_aliased_vuses for
walking equivalent vuses. Simplify.
(vn_reference_lookup): Likewise.
(vn_reference_insert): Likewise.
(vn_reference_insert_pieces): Likewise.
(visit_reference_op_call): Simplify.
(visit_reference_op_load): Likewise.
(visit_reference_op_store): Likewise.
(init_scc_vn): Remove shared_lookup_vuses initialization.
(free_scc_vn): Remove shared_lookup_vuses freeing.
(sort_vuses, sort_vuses_heap): Remove.
(get_ref_from_reference_ops): Export.
* tree-ssa-sccvn.h (struct vn_reference_s): Replace vuses
vector with single vuse pointer.
(vn_reference_lookup_pieces, vn_reference_lookup,
vn_reference_insert, vn_reference_insert_pieces): Adjust prototypes.
(shared_vuses_from_stmt): Remove.
(get_ref_from_reference_ops): Declare.
* tree-ssa-loop-manip.c (slpeel_can_duplicate_loop_p): Adjust.
* tree-ssa-copyrename.c (copy_rename_partition_coalesce): Remove
memory-tag related code.
* tree-ssa-ccp.c (get_symbol_constant_value): Remove memory-tag code.
(likely_value): Add comment, skip static-chain of call statements.
(surely_varying_stmt_p): Adjust.
(gimplify_and_update_call_from_tree): Likewise.
(execute_fold_all_builtins): Do not rebuild alias info.
(gimplify_and_update_call_from_tree): Properly update VOPs.
* tree-ssa-loop-ivopts.c (get_ref_tag): Remove.
(copy_ref_info): Remove memory-tag related code.
* tree-call-cdce.c (tree_call_cdce): Rename the VOP.
* ipa-pure-const.c (check_decl): Remove memory-tag related code.
(check_stmt): Open-code gimple_loaded_syms and gimple_stored_syms
computation.
* tree-ssa-dom.c (gimple_p): Remove typedef.
(eliminate_redundant_computations): Adjust.
(record_equivalences_from_stmt): Likewise.
(avail_expr_hash): Likewise.
(avail_expr_eq): Likewise.
* tree-ssa-propagate.c (update_call_from_tree): Properly
update VOPs.
(stmt_makes_single_load): Likewise.
(stmt_makes_single_store): Likewise.
* tree-ssa-alias.c: Rewrite completely.
(debug_memory_partitions, dump_mem_ref_stats, debug_mem_ref_stats,
debug_mem_sym_stats, dump_mem_sym_stats_for_var,
debug_all_mem_sym_stats, debug_mp_info, update_mem_sym_stats_from_stmt,
delete_mem_ref_stats, create_tag_raw, dump_points_to_info,
dump_may_aliases_for, debug_may_aliases_for, new_type_alias):
Remove public functions.
(pass_reset_cc_flags): Remove.
(pass_build_alias): Move ...
* tree-ssa-structalias.c (pass_build_alias): ... here.
* tree-ssa-alias.c (may_be_aliased): Move ...
* tree-flow-inline.h (may_be_aliased): ... here.
* tree-ssa-alias.c (struct count_ptr_d, count_ptr_derefs,
count_uses_and_derefs): Move ...
* gimple.c: ... here.
* gimple.h (count_uses_and_derefs): Declare.
* tree-ssa-alias.c (dump_alias_stats, ptr_deref_may_alias_global_p,
ptr_deref_may_alias_decl_p, ptr_derefs_may_alias_p,
same_type_for_tbaa, nonaliasing_component_refs_p, decl_refs_may_alias_p,
indirect_ref_may_alias_decl_p, indirect_refs_may_alias_p,
ref_maybe_used_by_call_p, ref_maybe_used_by_stmt_p,
call_may_clobber_ref_p, stmt_may_clobber_ref_p, maybe_skip_until,
get_continuation_for_phi, walk_non_aliased_vuses, walk_aliased_vdefs):
New functions.
* tree-dfa.c (refs_may_alias_p): Move ...
* tree-ssa-alias.c (refs_may_alias_p): ... here. Extend.
* tree-ssa-alias.h: New file.
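A minimal sketch of what the new decl-based oracle queries (for example ptr_deref_may_alias_decl_p) let later passes prove; illustrative only:

```c
int
f (int *p)
{
  int l = 0;
  *p = 1;       /* cannot touch 'l': its address is never taken */
  return l;     /* may be folded to 0 */
}
```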
* tree-ssa-sink.c (is_hidden_global_store): Adjust.
(statement_sink_location): Likewise.
* opts.c (decode_options): Do not adjust max-aliased-vops or
avg-aliased-vops values.
* timevar.def (TV_TREE_MAY_ALIAS): Remove.
(TV_CALL_CLOBBER): Likewise.
(TV_FLOW_SENSITIVE): Likewise.
(TV_FLOW_INSENSITIVE): Likewise.
(TV_MEMORY_PARTITIONING): Likewise.
(TV_ALIAS_STMT_WALK): New timevar.
* tree-ssa-loop-ivcanon.c (empty_loop_p): Adjust.
* tree-ssa-address.c (create_mem_ref_raw): Use build6.
(get_address_description): Remove memory-tag related code.
* tree-ssa-ifcombine.c (bb_no_side_effects_p): Adjust.
* treestruct.def (TS_MEMORY_TAG, TS_MEMORY_PARTITION_TAG): Remove.
* tree-eh.c (cleanup_empty_eh): Do not leave stale SSA_NAMEs
and immediate uses in statements. Document.
* gimple-pretty-print.c (dump_gimple_mem_ops): Adjust.
(dump_symbols): Remove.
(dump_gimple_mem_ops): Do not dump loaded or stored syms.
* alias.c (get_deref_alias_set): New function split out from ...
(get_alias_set): ... here.
* alias.h (get_deref_alias_set): Declare.
* tree-vect-data-refs.c (vect_create_data_ref_ptr): Remove unused
type parameter. Remove restrict pointer handling. Create a
ref-all pointer in case type-based alias sets do not conflict.
(vect_analyze_data_refs): Remove SMT related code.
* tree-vect-stmts.c (vectorizable_store): Re-instantiate TBAA assert.
(vectorizable_load): Likewise.
* tree-data-ref.h (struct dr_alias): Remove symbol_tag field.
(DR_SYMBOL_TAG, DR_VOPS): Remove.
* tree-data-ref.c (dr_may_alias_p): Use the alias-oracle.
Ignore vops and SMTs.
(dr_analyze_alias): Likewise.
(free_data_ref): Likewise.
(create_data_ref): Likewise.
(analyze_all_data_dependences): Likewise.
(get_references_in_stmt): Adjust.
* tree-flow-inline.h (gimple_aliases_computed_p,
gimple_addressable_vars, gimple_call_clobbered_vars,
gimple_call_used_vars, gimple_global_var, may_aliases, memory_partition,
factoring_name_p, mark_call_clobbered, clear_call_clobbered,
compare_ssa_operands_equal, symbol_mem_tag, set_symbol_mem_tag,
gimple_mem_ref_stats): Remove.
(gimple_vop): New function.
(op_iter_next_use): Remove vuses and mayuses cases.
(op_iter_next_def): Remove vdefs case.
(op_iter_next_tree): Remove vuses, mayuses and vdefs cases.
(clear_and_done_ssa_iter): Do not set removed fields.
(op_iter_init): Likewise. Skip vuse and/or vdef if requested.
Assert we are not iterating over vuses or vdefs if not also
iterating over uses or defs.
(op_iter_init_use): Likewise.
(op_iter_init_def): Likewise.
(op_iter_next_vdef): Remove.
(op_iter_next_mustdef): Likewise.
(op_iter_init_vdef): Likewise.
(compare_ssa_operands_equal): Likewise.
(link_use_stmts_after): Handle vuse operand.
(is_call_used): Use is_call_clobbered.
(is_call_clobbered): Global variables are always call clobbered,
query the call-clobbers bitmap.
(mark_call_clobbered): Ignore global variables.
(clear_call_clobbered): Likewise.
* tree-ssa-coalesce.c (create_outofssa_var_map): Adjust
virtual operands sanity check.
* tree.def (NAME_MEMORY_TAG, SYMBOL_MEMORY_TAG, MEMORY_PARTITION_TAG):
Remove.
(TARGET_MEM_REF): Remove TMR_TAG operand.
* tree-dfa.c (add_referenced_var): Initialize call-clobber state.
Remove call-clobber related code.
(remove_referenced_var): Likewise. Do not clear mpt or symbol_mem_tag.
(dump_variable): Do not dump SMTs, memory stats, may-aliases or
partitions or escape reason.
(get_single_def_stmt, get_single_def_stmt_from_phi,
get_single_def_stmt_with_phi): Remove.
(dump_referenced_vars): Tidy.
(get_ref_base_and_extent): Allow bare decls.
(collect_dfa_stats): Adjust.
* graphite.c (rename_variables_in_stmt): Adjust.
(graphite_copy_stmts_from_block): Likewise.
(translate_clast): Likewise.
* tree-ssa-pre.c (struct bb_bitmap_sets): Add expr_dies bitmap.
(EXPR_DIES): New.
(translate_vuse_through_block): Use the oracle.
(phi_translate_1): Adjust.
(value_dies_in_block_x): Use the oracle. Cache the outcome
in EXPR_DIES.
(valid_in_sets): Check if the VUSE for
a REFERENCE is available.
(eliminate): Do not remove stmts during elimination,
instead queue and remove them afterwards.
(do_pre): Do not rebuild alias info.
(pass_pre): Run TODO_rebuild_alias before PRE.
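A sketch of the kind of redundancy FRE/PRE can now remove by walking the single VUSE chain and asking the oracle about the intervening store (illustrative only, in the spirit of the new ssa-fre tests):

```c
int
f (int *p, float *q)
{
  int x = *p;
  *q = 0.0f;     /* type-based rules say this cannot clobber *p */
  return x + *p; /* the second load of *p is redundant */
}
```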
* tree-ssa-live.c (remove_unused_locals): Remove memory-tag code.
* tree-sra.c (sra_walk_function): Use gimple_references_memory_p.
(mark_all_v_defs_stmt): Remove.
(mark_all_v_defs_seq): Adjust.
(sra_replace): Likewise.
(scalarize_use): Likewise.
(scalarize_copy): Likewise.
(scalarize_init): Likewise.
(scalarize_ldst): Likewise.
(todoflags): Remove.
(tree_sra): Do not rebuild alias info.
(tree_sra_early): Adjust.
(pass_sra): Run TODO_update_address_taken before SRA.
* tree-predcom.c (set_alias_info): Remove.
(prepare_initializers_chain): Do not call it.
(mark_virtual_ops_for_renaming): Adjust.
(mark_virtual_ops_for_renaming_list): Remove.
(initialize_root_vars): Adjust.
(initialize_root_vars_lm): Likewise.
(prepare_initializers_chain): Likewise.
* tree-ssa-copy.c (may_propagate_copy): Remove memory-tag related code.
(may_propagate_copy_into_stmt): Likewise.
(merge_alias_info): Do nothing for now.
(propagate_tree_value_into_stmt): Adjust.
(stmt_may_generate_copy): Likewise.
* tree-ssa-forwprop.c (tidy_after_forward_propagate_addr): Do
not mark symbols for renaming.
(forward_propagate_addr_expr): Match up push/pop_stmt_changes
with the same statement, make sure to update the new pointed-to one.
* tree-ssa-dce.c (eliminate_unnecessary_stmts): Do not copy
call statements, do not mark symbols for renaming.
(mark_operand_necessary): Dump something.
(ref_may_be_aliased): New function.
(mark_aliased_reaching_defs_necessary_1): New helper function.
(mark_aliased_reaching_defs_necessary): Likewise.
(mark_all_reaching_defs_necessary_1): Likewise.
(mark_all_reaching_defs_necessary): Likewise.
(propagate_necessity): Do not process virtual PHIs. For
non-aliased loads mark all reaching definitions as necessary.
For aliased loads and stores mark the immediate dominating
aliased clobbers as necessary.
(visited): New global static.
(perform_tree_ssa_dce): Free visited bitmap after propagating
necessity.
(remove_dead_phis): Perform simple dead virtual PHI removal.
(remove_dead_stmt): Properly unlink virtual operands when
removing stores.
(eliminate_unnecessary_stmts): Schedule PHI removal after
stmt removal.
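A sketch of a store the new aliased-def handling in DCE can remove: the array is local, never escapes and is never read (illustrative only, in the spirit of the new ssa-dce tests):

```c
int
f (int i)
{
  int a[16];
  a[i] = 1;     /* dead: 'a' does not escape and is never loaded */
  return i;
}
```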
* tree-ssa-ter.c (is_replaceable_p): Adjust.
(process_replaceable): Likewise.
(find_replaceable_in_bb): Likewise.
* tree-ssa.c (verify_ssa_name): Verify all VOPs are
based on the single gimple vop.
(verify_flow_insensitive_alias_info): Remove.
(verify_flow_sensitive_alias_info): Likewise.
(verify_call_clobbering): Likewise.
(verify_memory_partitions): Likewise.
(verify_alias_info): Likewise.
(verify_ssa): Adjust.
(execute_update_addresses_taken): Export. Update SSA
manually. Optimize only when optimizing. Use a local bitmap.
(pass_update_address_taken): Remove TODO_update_ssa, add
TODO_dump_func.
(pass_update_address_taken): Just use TODO_update_address_taken.
(init_tree_ssa): Do not initialize addressable_vars.
(verify_ssa): Verify new VUSE / VDEF properties.
Verify that all stmts definitions have the stmt as SSA_NAME_DEF_STMT.
Do not call verify_alias_info.
(delete_tree_ssa): Clear the VUSE, VDEF operands.
Do not free the loaded and stored syms bitmaps. Reset the escaped
and callused solutions. Do not free addressable_vars.
Remove memory-tag related code.
(warn_uninitialized_var): Aliases are always available.
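A sketch of what TODO_update_address_taken is for: once forwprop folds the dereference of &a, nothing takes the address of 'a' anymore and it can be rewritten into SSA form (illustrative only):

```c
int
f (void)
{
  int a = 1;
  int *p = &a;  /* address taken, but only to be dereferenced below */
  return *p;    /* folds to 'a'; afterwards 'a' need not live in memory */
}
```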
* tree-ssa-loop-prefetch.c (gather_memory_references): Adjust.
* lambda-code.c (can_put_in_inner_loop): Adjust.
(can_put_after_inner_loop): Likewise.
(perfect_nestify): Likewise.
* tree-vect-stmts.c (vect_stmt_relevant_p): Adjust.
(vect_gen_widened_results_half): Remove CALL_EXPR handling.
(vectorizable_conversion): Do not mark symbols for renaming.
* tree-inline.c (remap_gimple_stmt): Clear VUSE/VDEF.
(expand_call_inline): Unlink the calls virtual operands before
replacing it.
(tree_function_versioning): Do not call update_ssa if we are not
updating clones. Simplify.
* tree-ssa-phiprop.c (phivn_valid_p): Adjust.
(propagate_with_phi): Likewise.
* tree-outof-ssa.c (create_temp): Remove memory tag and call
clobber code. Assert we are not aliased or global.
* tree-flow.h: Include tree-ssa-alias.h
(enum escape_type): Remove.
(struct mem_sym_stats_d): Likewise.
(struct mem_ref_stats_d): Likewise.
(struct gimple_df): Add vop member. Remove global_var,
call_clobbered_vars, call_used_vars, addressable_vars,
aliases_computed_p and mem_ref_stats members. Add syms_to_rename,
escaped and callused members.
(struct ptr_info_def): Remove all members, add points-to solution
member pt.
(struct var_ann_d): Remove in_vuse_list, in_vdef_list,
call_clobbered, escape_mask, mpt and symbol_mem_tag members.
* Makefile.in (TREE_FLOW_H): Add tree-ssa-alias.h.
(tree-ssa-structalias.o): Remove tree-ssa-structalias.h.
(tree-ssa-alias.o): Likewise.
(toplev.o): Add tree-ssa-alias.h.
(GTFILES): Remove tree-ssa-structalias.h, add tree-ssa-alias.h.
* gimple.c (gimple_set_bb): Fix off-by-one error.
(is_gimple_reg): Do not handle memory tags.
(gimple_copy): Also copy virtual operands.
Delay updating the statement. Do not reset loaded and stored syms.
(gimple_set_stored_syms): Remove.
(gimple_set_loaded_syms): Likewise.
(gimple_call_copy_skip_args): Copy the virtual operands
and mark the new statement modified.
* tree-ssa-structalias.c (may_alias_p): Remove.
(set_uids_in_ptset): Take the alias set to prune with as
parameter. Fold in the alias test of may_alias_p.
(compute_points_to_sets): Compute whether a ptr is dereferenced
in a local sbitmap.
(process_constraint): Deal with &ANYTHING on the lhs, reject all
other ADDRESSOF constraints on the lhs.
(get_constraint_for_component_ref): Assert that we don't get
ADDRESSOF constraints from the base of the reference.
Properly generate UNKNOWN_OFFSET for DEREF if needed.
(struct variable_info): Remove collapsed_to member.
(get_varinfo_fc): Remove.
(new_var_info): Do not set collapsed_to.
(dump_constraint): Do not follow cycles.
(dump_constraint_graph): Likewise.
(build_pred_graph): Likewise.
(build_succ_graph): Likewise.
(rewrite_constraints): Likewise.
(do_simple_structure_copy): Remove.
(do_rhs_deref_structure_copy): Remove.
(do_lhs_deref_structure_copy): Remove.
(collapse_rest_of_var): Remove.
(do_structure_copy): Re-implement.
(pta_stats): New global variable.
(dump_pta_stats): New function.
(struct constraint_expr): Make offset signed.
(UNKNOWN_OFFSET): Define special value.
(dump_constraint): Dump UNKNOWN_OFFSET as UNKNOWN.
(solution_set_expand): New helper function split out from ...
(do_sd_constraint): ... here.
(solution_set_add): Handle UNKNOWN_OFFSET. Handle negative offsets.
(do_ds_constraint): Likewise.
(do_sd_constraint): Likewise. Do not special-case ESCAPED = *ESCAPED
and CALLUSED = *CALLUSED.
(set_union_with_increment): Make inc argument signed.
(type_safe): Remove.
(get_constraint_for_ptr_offset): Handle unknown and negative
constant offsets.
(first_vi_for_offset): Handle offsets before start. Bail
out early for offsets beyond the variable extent.
(first_or_preceding_vi_for_offset): New function.
(init_base_vars): Add ESCAPED = ESCAPED + UNKNOWN_OFFSET constraint.
Together with ESCAPED = *ESCAPED this properly computes reachability.
(find_what_var_points_to): New function.
(find_what_p_points_to): Implement in terms of find_what_var_points_to.
(pt_solution_reset, pt_solution_empty_p, pt_solution_includes_global,
pt_solution_includes_1, pt_solution_includes, pt_solutions_intersect_1,
pt_solutions_intersect): New functions.
(compute_call_used_vars): Remove.
(compute_may_aliases): New main entry into PTA computation.
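A sketch, loosely modelled on the pta-ptrarith testcases, of the pointer arithmetic the new UNKNOWN_OFFSET and negative-offset handling copes with (illustrative only):

```c
int a[16];

int *
f (int i)
{
  int *p = &a[8];
  return p - i;  /* unknown, possibly negative offset: points-to stays { a } */
}
```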
* gimple.h (gimple_p): New typedef.
(struct gimple_statement_base): Remove references_memory_p.
(struct gimple_statement_with_memory_ops_base): Remove
vdef_ops, vuse_ops, stores and loads members. Add vdef and vuse
members.
(gimple_vuse_ops, gimple_set_vuse_ops, gimple_vdef_ops,
gimple_set_vdef_ops, gimple_loaded_syms, gimple_stored_syms,
gimple_set_references_memory): Remove.
(gimple_vuse_op, gimple_vdef_op, gimple_vuse, gimple_vdef,
gimple_vuse_ptr, gimple_vdef_ptr, gimple_set_vuse, gimple_set_vdef):
New functions.
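A sketch of the representation the gimple_vuse/gimple_vdef accessors expose: one artificial .MEM symbol per function, with at most one VUSE and one VDEF per statement (the dump syntax in the comments is the usual GIMPLE virtual-operand form; illustrative only):

```c
int *p;
int q;

void
store_it (void)
{
  /* # .MEM_2 = VDEF <.MEM_1>  -- single definition of .MEM */
  *p = 4;
}

int
load_it (void)
{
  /* # VUSE <.MEM_1>           -- single use, no per-symbol VUSE lists */
  return q;
}
```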
* tree-cfg.c (move_block_to_fn): Fix off-by-one error.
(verify_expr): Allow RESULT_DECL.
(gimple_duplicate_bb): Do not copy virtual operands.
(gimple_duplicate_sese_region): Adjust.
(gimple_duplicate_sese_tail): Likewise.
(mark_virtual_ops_in_region): Remove.
(move_sese_region_to_fn): Do not call it.
* passes.c (init_optimization_passes): Remove pass_reset_cc_flags
and pass_simple_dse.
(execute_function_todo): Handle TODO_update_address_taken,
call execute_update_addresses_taken for TODO_rebuild_alias.
(execute_todo): Adjust.
(execute_one_pass): Init dump files early.
* ipa-struct-reorg.c (finalize_var_creation): Do not mark vars
call-clobbered.
(create_general_new_stmt): Clear vops.
* tree-ssa-reassoc.c (get_rank): Adjust.
* tree-vect-slp.c (vect_create_mask_and_perm): Do not mark
symbols for renaming.
* params.def (PARAM_MAX_ALIASED_VOPS): Remove.
(PARAM_AVG_ALIASED_VOPS): Likewise.
* tree-ssanames.c (init_ssanames): Allocate SYMS_TO_RENAME.
(duplicate_ssa_name_ptr_info): No need to copy the shared bitmaps.
* tree-ssa-operands.c: Simplify for new virtual operand
representation.
(operand_build_cmp, copy_virtual_operands,
create_ssa_artificial_load_stmt, add_to_addressable_set,
gimple_add_to_addresses_taken): Remove public functions.
(unlink_stmt_vdef): New function.
* gcc.dg/pr19633-1.c: Adjust.
* gcc.dg/torture/pta-callused-1.c: Likewise.
* gcc.dg/torture/pr39074-2.c: Likewise.
* gcc.dg/torture/pr39074.c: Likewise.
* gcc.dg/torture/pta-ptrarith-3.c: New testcase.
* gcc.dg/torture/pr30375.c: Adjust.
* gcc.dg/torture/pr33563.c: Likewise.
* gcc.dg/torture/pr33870.c: Likewise.
* gcc.dg/torture/pr33560.c: Likewise.
* gcc.dg/torture/pta-structcopy-1.c: New testcase.
* gcc.dg/torture/ssa-pta-fn-1.c: Likewise.
* gcc.dg/tree-ssa/alias-15.c: Remove.
* gcc.dg/tree-ssa/ssa-dce-4.c: New testcase.
* gcc.dg/tree-ssa/pr26421.c: Adjust.
* gcc.dg/tree-ssa/ssa-fre-10.c: XFAIL.
* gcc.dg/tree-ssa/ssa-dce-5.c: New testcase.
* gcc.dg/tree-ssa/pr23382.c: Adjust.
* gcc.dg/tree-ssa/ssa-fre-20.c: New testcase.
* gcc.dg/tree-ssa/alias-16.c: Adjust.
* gcc.dg/tree-ssa/ssa-fre-13.c: Likewise.
* gcc.dg/tree-ssa/ssa-fre-14.c: Likewise.
* gcc.dg/tree-ssa/alias-18.c: Likewise.
* gcc.dg/tree-ssa/ssa-fre-15.c: Likewise.
* gcc.dg/tree-ssa/ssa-lim-3.c: Likewise.
* gcc.dg/tree-ssa/alias-19.c: Likewise.
* gcc.dg/tree-ssa/pta-ptrarith-1.c: New testcase.
* gcc.dg/tree-ssa/pr13146.c: Likewise.
* gcc.dg/tree-ssa/ssa-pre-23.c: Likewise.
* gcc.dg/tree-ssa/pta-ptrarith-2.c: Likewise.
* gcc.dg/tree-ssa/ssa-fre-18.c: Likewise.
* gcc.dg/tree-ssa/ssa-pre-24.c: New XFAILed testcase.
* gcc.dg/tree-ssa/ssa-fre-19.c: New testcase.
* gcc.dg/tree-ssa/alias-20.c: Likewise.
* gcc.dg/tree-ssa/ssa-dse-12.c: Likewise.
* gcc.dg/tree-ssa/pr38895.c: Likewise.
* gcc.dg/uninit-B.c: XFAIL.
* gcc.dg/vect/no-vfa-vect-43.c: Adjust.
* gcc.dg/uninit-pr19430.c: XFAIL.
* g++.dg/tree-ssa/pr13146.C: New testcase.
* g++.dg/opt/pr36187.C: Adjust.
* g++.dg/torture/20090329-1.C: New testcase.
git-svn-id: svn+ssh://gcc.gnu.org/svn/gcc/trunk@145494 138bc75d-0d04-0410-961f-82ee72b054a4
Diffstat (limited to 'gcc/tree-ssa-operands.c')
-rw-r--r-- | gcc/tree-ssa-operands.c | 1779 |
1 file changed, 242 insertions, 1537 deletions
diff --git a/gcc/tree-ssa-operands.c b/gcc/tree-ssa-operands.c index 85a0a08b3da..56d8e84a73b 100644 --- a/gcc/tree-ssa-operands.c +++ b/gcc/tree-ssa-operands.c @@ -75,11 +75,6 @@ along with GCC; see the file COPYING3. If not see operand vector for VUSE, then the new vector will also be modified such that it contains 'a_5' rather than 'a'. */ -/* Helper functions from gimple.c. These are GIMPLE manipulation - routines that only the operand scanner should need. */ -void gimple_set_stored_syms (gimple, bitmap, bitmap_obstack *); -void gimple_set_loaded_syms (gimple, bitmap, bitmap_obstack *); - /* Structure storing statistics on how many call clobbers we have, and how many where avoided. */ @@ -137,69 +132,26 @@ static VEC(tree,heap) *build_defs; /* Array for building all the use operands. */ static VEC(tree,heap) *build_uses; -/* Set for building all the VDEF operands. */ -static VEC(tree,heap) *build_vdefs; +/* The built VDEF operand. */ +static tree build_vdef; -/* Set for building all the VUSE operands. */ -static VEC(tree,heap) *build_vuses; +/* The built VUSE operand. */ +static tree build_vuse; /* Bitmap obstack for our datastructures that needs to survive across compilations of multiple functions. */ static bitmap_obstack operands_bitmap_obstack; -/* Set for building all the loaded symbols. */ -static bitmap build_loads; - -/* Set for building all the stored symbols. */ -static bitmap build_stores; - static void get_expr_operands (gimple, tree *, int); /* Number of functions with initialized ssa_operands. */ static int n_initialized = 0; -/* Statement change buffer. Data structure used to record state - information for statements. This is used to determine what needs - to be done in order to update the SSA web after a statement is - modified by a pass. If STMT is a statement that has just been - created, or needs to be folded via fold_stmt, or anything that - changes its physical structure then the pass should: - - 1- Call push_stmt_changes (&stmt) to record the current state of - STMT before any modifications are made. - - 2- Make all appropriate modifications to the statement. - - 3- Call pop_stmt_changes (&stmt) to find new symbols that - need to be put in SSA form, SSA name mappings for names that - have disappeared, recompute invariantness for address - expressions, cleanup EH information, etc. - - If it is possible to determine that the statement was not modified, - instead of calling pop_stmt_changes it is quicker to call - discard_stmt_changes to avoid the expensive and unnecessary operand - re-scan and change comparison. */ - -struct scb_d -{ - /* Pointer to the statement being modified. */ - gimple *stmt_p; - - /* If the statement references memory these are the sets of symbols - loaded and stored by the statement. */ - bitmap loads; - bitmap stores; -}; - -typedef struct scb_d *scb_t; -DEF_VEC_P(scb_t); -DEF_VEC_ALLOC_P(scb_t,heap); - -/* Stack of statement change buffers (SCB). Every call to - push_stmt_changes pushes a new buffer onto the stack. Calls to - pop_stmt_changes pop a buffer off of the stack and compute the set +/* Stack of statements to change. Every call to + push_stmt_changes pushes the stmt onto the stack. Calls to + pop_stmt_changes pop a stmt off of the stack and compute the set of changes for the popped statement. */ -static VEC(scb_t,heap) *scb_stack; +static VEC(gimple_p,heap) *scb_stack; /* Return the DECL_UID of the base variable of T. 
*/ @@ -213,54 +165,6 @@ get_name_decl (const_tree t) } -/* Comparison function for qsort used in operand_build_sort_virtual. */ - -int -operand_build_cmp (const void *p, const void *q) -{ - const_tree const e1 = *((const_tree const *)p); - const_tree const e2 = *((const_tree const *)q); - const unsigned int u1 = get_name_decl (e1); - const unsigned int u2 = get_name_decl (e2); - - /* We want to sort in ascending order. They can never be equal. */ -#ifdef ENABLE_CHECKING - gcc_assert (u1 != u2); -#endif - return (u1 > u2 ? 1 : -1); -} - - -/* Sort the virtual operands in LIST from lowest DECL_UID to highest. */ - -static inline void -operand_build_sort_virtual (VEC(tree,heap) *list) -{ - int num = VEC_length (tree, list); - - if (num < 2) - return; - - if (num == 2) - { - if (get_name_decl (VEC_index (tree, list, 0)) - > get_name_decl (VEC_index (tree, list, 1))) - { - /* Swap elements if in the wrong order. */ - tree tmp = VEC_index (tree, list, 0); - VEC_replace (tree, list, 0, VEC_index (tree, list, 1)); - VEC_replace (tree, list, 1, tmp); - } - return; - } - - /* There are 3 or more elements, call qsort. */ - qsort (VEC_address (tree, list), - VEC_length (tree, list), - sizeof (tree), - operand_build_cmp); -} - /* Return true if the SSA operands cache is active. */ bool @@ -276,94 +180,44 @@ ssa_operands_active (void) return cfun->gimple_df && gimple_ssa_operands (cfun)->ops_active; } + +/* Create the VOP variable, an artificial global variable to act as a + representative of all of the virtual operands FUD chain. */ -/* VOPs are of variable sized, so the free list maps "free buckets" to the - following table: - bucket # operands - ------ ---------- - 0 1 - 1 2 - ... - 15 16 - 16 17-24 - 17 25-32 - 18 31-40 - ... - 29 121-128 - Any VOPs larger than this are simply added to the largest bucket when they - are freed. */ - - -/* Return the number of operands used in bucket BUCKET. */ - -static inline int -vop_free_bucket_size (int bucket) -{ -#ifdef ENABLE_CHECKING - gcc_assert (bucket >= 0 && bucket < NUM_VOP_FREE_BUCKETS); -#endif - if (bucket < 16) - return bucket + 1; - return (bucket - 13) * 8; -} - - -/* For a vop of NUM operands, return the bucket NUM belongs to. If NUM is - beyond the end of the bucket table, return -1. */ - -static inline int -vop_free_bucket_index (int num) -{ - gcc_assert (num > 0 && NUM_VOP_FREE_BUCKETS > 16); - - /* Sizes 1 through 16 use buckets 0-15. */ - if (num <= 16) - return num - 1; - /* Buckets 16 - NUM_VOP_FREE_BUCKETS represent 8 unit chunks. */ - num = 14 + (num - 1) / 8; - if (num >= NUM_VOP_FREE_BUCKETS) - return -1; - else - return num; -} - - -/* Initialize the VOP free buckets. */ - -static inline void -init_vop_buckets (void) -{ - int x; - - for (x = 0; x < NUM_VOP_FREE_BUCKETS; x++) - gimple_ssa_operands (cfun)->vop_free_buckets[x] = NULL; -} - - -/* Add PTR to the appropriate VOP bucket. */ - -static inline void -add_vop_to_freelist (voptype_p ptr) +static void +create_vop_var (void) { - int bucket = vop_free_bucket_index (VUSE_VECT_NUM_ELEM (ptr->usev)); - - /* Too large, use the largest bucket so its not a complete throw away. 
*/ - if (bucket == -1) - bucket = NUM_VOP_FREE_BUCKETS - 1; - - ptr->next = gimple_ssa_operands (cfun)->vop_free_buckets[bucket]; - gimple_ssa_operands (cfun)->vop_free_buckets[bucket] = ptr; + tree global_var; + + gcc_assert (cfun->gimple_df->vop == NULL_TREE); + + global_var = build_decl (VAR_DECL, get_identifier (".MEM"), + void_type_node); + DECL_ARTIFICIAL (global_var) = 1; + TREE_READONLY (global_var) = 0; + DECL_EXTERNAL (global_var) = 1; + TREE_STATIC (global_var) = 1; + TREE_USED (global_var) = 1; + DECL_CONTEXT (global_var) = NULL_TREE; + TREE_THIS_VOLATILE (global_var) = 0; + TREE_ADDRESSABLE (global_var) = 0; + + create_var_ann (global_var); + add_referenced_var (global_var); + cfun->gimple_df->vop = global_var; } - -/* These are the sizes of the operand memory buffer which gets allocated each - time more operands space is required. The final value is the amount that is - allocated every time after that. */ +/* These are the sizes of the operand memory buffer in bytes which gets + allocated each time more operands space is required. The final value is + the amount that is allocated every time after that. + In 1k we can fit 25 use operands (or 63 def operands) on a host with + 8 byte pointers, that would be 10 statements each with 1 def and 2 + uses. */ #define OP_SIZE_INIT 0 -#define OP_SIZE_1 30 -#define OP_SIZE_2 110 -#define OP_SIZE_3 511 +#define OP_SIZE_1 (1024 - sizeof (void *)) +#define OP_SIZE_2 (1024 * 4 - sizeof (void *)) +#define OP_SIZE_3 (1024 * 16 - sizeof (void *)) /* Initialize the operand cache routines. */ @@ -374,22 +228,19 @@ init_ssa_operands (void) { build_defs = VEC_alloc (tree, heap, 5); build_uses = VEC_alloc (tree, heap, 10); - build_vuses = VEC_alloc (tree, heap, 25); - build_vdefs = VEC_alloc (tree, heap, 25); + build_vuse = NULL_TREE; + build_vdef = NULL_TREE; bitmap_obstack_initialize (&operands_bitmap_obstack); - build_loads = BITMAP_ALLOC (&operands_bitmap_obstack); - build_stores = BITMAP_ALLOC (&operands_bitmap_obstack); - scb_stack = VEC_alloc (scb_t, heap, 20); + scb_stack = VEC_alloc (gimple_p, heap, 20); } gcc_assert (gimple_ssa_operands (cfun)->operand_memory == NULL); - gcc_assert (gimple_ssa_operands (cfun)->mpt_table == NULL); gimple_ssa_operands (cfun)->operand_memory_index = gimple_ssa_operands (cfun)->ssa_operand_mem_size; gimple_ssa_operands (cfun)->ops_active = true; memset (&clobber_stats, 0, sizeof (clobber_stats)); - init_vop_buckets (); gimple_ssa_operands (cfun)->ssa_operand_mem_size = OP_SIZE_INIT; + create_vop_var (); } @@ -399,21 +250,17 @@ void fini_ssa_operands (void) { struct ssa_operand_memory_d *ptr; - unsigned ix; - tree mpt; if (!--n_initialized) { VEC_free (tree, heap, build_defs); VEC_free (tree, heap, build_uses); - VEC_free (tree, heap, build_vdefs); - VEC_free (tree, heap, build_vuses); - BITMAP_FREE (build_loads); - BITMAP_FREE (build_stores); + build_vdef = NULL_TREE; + build_vuse = NULL_TREE; /* The change buffer stack had better be empty. 
*/ - gcc_assert (VEC_length (scb_t, scb_stack) == 0); - VEC_free (scb_t, heap, scb_stack); + gcc_assert (VEC_length (gimple_p, scb_stack) == 0); + VEC_free (gimple_p, heap, scb_stack); scb_stack = NULL; } @@ -427,21 +274,13 @@ fini_ssa_operands (void) ggc_free (ptr); } - for (ix = 0; - VEC_iterate (tree, gimple_ssa_operands (cfun)->mpt_table, ix, mpt); - ix++) - { - if (mpt) - BITMAP_FREE (MPT_SYMBOLS (mpt)); - } - - VEC_free (tree, heap, gimple_ssa_operands (cfun)->mpt_table); - gimple_ssa_operands (cfun)->ops_active = false; if (!n_initialized) bitmap_obstack_release (&operands_bitmap_obstack); + cfun->gimple_df->vop = NULL_TREE; + if (dump_file && (dump_flags & TDF_STATS)) { fprintf (dump_file, "Original clobbered vars: %d\n", @@ -460,47 +299,45 @@ fini_ssa_operands (void) } -/* Return memory for operands of SIZE chunks. */ +/* Return memory for an operand of size SIZE. */ static inline void * ssa_operand_alloc (unsigned size) { char *ptr; + gcc_assert (size == sizeof (struct use_optype_d) + || size == sizeof (struct def_optype_d)); + if (gimple_ssa_operands (cfun)->operand_memory_index + size >= gimple_ssa_operands (cfun)->ssa_operand_mem_size) { struct ssa_operand_memory_d *ptr; - if (gimple_ssa_operands (cfun)->ssa_operand_mem_size == OP_SIZE_INIT) - gimple_ssa_operands (cfun)->ssa_operand_mem_size - = OP_SIZE_1 * sizeof (struct voptype_d); - else - if (gimple_ssa_operands (cfun)->ssa_operand_mem_size - == OP_SIZE_1 * sizeof (struct voptype_d)) - gimple_ssa_operands (cfun)->ssa_operand_mem_size - = OP_SIZE_2 * sizeof (struct voptype_d); - else - gimple_ssa_operands (cfun)->ssa_operand_mem_size - = OP_SIZE_3 * sizeof (struct voptype_d); - - /* Go right to the maximum size if the request is too large. */ - if (size > gimple_ssa_operands (cfun)->ssa_operand_mem_size) - gimple_ssa_operands (cfun)->ssa_operand_mem_size - = OP_SIZE_3 * sizeof (struct voptype_d); - - /* We can reliably trigger the case that we need arbitrary many - operands (see PR34093), so allocate a buffer just for this request. */ - if (size > gimple_ssa_operands (cfun)->ssa_operand_mem_size) - gimple_ssa_operands (cfun)->ssa_operand_mem_size = size; + switch (gimple_ssa_operands (cfun)->ssa_operand_mem_size) + { + case OP_SIZE_INIT: + gimple_ssa_operands (cfun)->ssa_operand_mem_size = OP_SIZE_1; + break; + case OP_SIZE_1: + gimple_ssa_operands (cfun)->ssa_operand_mem_size = OP_SIZE_2; + break; + case OP_SIZE_2: + case OP_SIZE_3: + gimple_ssa_operands (cfun)->ssa_operand_mem_size = OP_SIZE_3; + break; + default: + gcc_unreachable (); + } ptr = (struct ssa_operand_memory_d *) - ggc_alloc (sizeof (struct ssa_operand_memory_d) - + gimple_ssa_operands (cfun)->ssa_operand_mem_size - 1); + ggc_alloc (sizeof (void *) + + gimple_ssa_operands (cfun)->ssa_operand_mem_size); ptr->next = gimple_ssa_operands (cfun)->operand_memory; gimple_ssa_operands (cfun)->operand_memory = ptr; gimple_ssa_operands (cfun)->operand_memory_index = 0; } + ptr = &(gimple_ssa_operands (cfun)->operand_memory ->mem[gimple_ssa_operands (cfun)->operand_memory_index]); gimple_ssa_operands (cfun)->operand_memory_index += size; @@ -546,55 +383,6 @@ alloc_use (void) } -/* Allocate a vop with NUM elements. */ - -static inline struct voptype_d * -alloc_vop (int num) -{ - struct voptype_d *ret = NULL; - int alloc_size = 0; - - int bucket = vop_free_bucket_index (num); - if (bucket != -1) - { - /* If there is a free operand, use it. 
*/ - if (gimple_ssa_operands (cfun)->vop_free_buckets[bucket] != NULL) - { - ret = gimple_ssa_operands (cfun)->vop_free_buckets[bucket]; - gimple_ssa_operands (cfun)->vop_free_buckets[bucket] = - gimple_ssa_operands (cfun)->vop_free_buckets[bucket]->next; - } - else - alloc_size = vop_free_bucket_size(bucket); - } - else - alloc_size = num; - - if (alloc_size > 0) - ret = (struct voptype_d *)ssa_operand_alloc ( - sizeof (struct voptype_d) + (alloc_size - 1) * sizeof (vuse_element_t)); - - VUSE_VECT_NUM_ELEM (ret->usev) = num; - return ret; -} - - -/* This routine makes sure that PTR is in an immediate use list, and makes - sure the stmt pointer is set to the current stmt. */ - -static inline void -set_virtual_use_link (use_operand_p ptr, gimple stmt) -{ - /* fold_stmt may have changed the stmt pointers. */ - if (ptr->loc.stmt != stmt) - ptr->loc.stmt = stmt; - - /* If this use isn't in a list, add it to the correct list. */ - if (!ptr->prev) - link_imm_use (ptr, *(ptr->use)); -} - - /* Adds OP to the list of defs after LAST. */ static inline def_optype_p @@ -626,56 +414,6 @@ add_use_op (gimple stmt, tree *op, use_optype_p last) } -/* Return a virtual op pointer with NUM elements which are all - initialized to OP and are linked into the immediate uses for STMT. - The new vop is appended after PREV. */ - -static inline voptype_p -add_vop (gimple stmt, tree op, int num, voptype_p prev) -{ - voptype_p new_vop; - int x; - - new_vop = alloc_vop (num); - for (x = 0; x < num; x++) - { - VUSE_OP_PTR (new_vop, x)->prev = NULL; - SET_VUSE_OP (new_vop, x, op); - VUSE_OP_PTR (new_vop, x)->use = &new_vop->usev.uses[x].use_var; - link_imm_use_stmt (VUSE_OP_PTR (new_vop, x), - new_vop->usev.uses[x].use_var, stmt); - } - - if (prev) - prev->next = new_vop; - new_vop->next = NULL; - return new_vop; -} - - -/* Adds OP to the list of vuses of statement STMT after LAST, and moves - LAST to the new element. */ - -static inline voptype_p -add_vuse_op (gimple stmt, tree op, int num, voptype_p last) -{ - voptype_p new_vop = add_vop (stmt, op, num, last); - VDEF_RESULT (new_vop) = NULL_TREE; - return new_vop; -} - - -/* Adds OP to the list of vdefs of statement STMT after LAST, and moves - LAST to the new element. */ - -static inline voptype_p -add_vdef_op (gimple stmt, tree op, int num, voptype_p last) -{ - voptype_p new_vop = add_vop (stmt, op, num, last); - VDEF_RESULT (new_vop) = op; - return new_vop; -} - /* Takes elements from build_defs and turns them into def operands of STMT. TODO -- Make build_defs VEC of tree *. */ @@ -691,6 +429,19 @@ finalize_ssa_defs (gimple stmt) /* There should only be a single real definition per assignment. */ gcc_assert ((stmt && gimple_code (stmt) != GIMPLE_ASSIGN) || num <= 1); + /* Pre-pend the vdef we may have built. */ + if (build_vdef != NULL_TREE) + { + tree oldvdef = gimple_vdef (stmt); + if (oldvdef + && TREE_CODE (oldvdef) == SSA_NAME) + oldvdef = SSA_NAME_VAR (oldvdef); + if (oldvdef != build_vdef) + gimple_set_vdef (stmt, build_vdef); + VEC_safe_insert (tree, heap, build_defs, 0, (tree)gimple_vdef_ptr (stmt)); + ++num; + } + new_list.next = NULL; last = &new_list; @@ -698,6 +449,23 @@ finalize_ssa_defs (gimple stmt) new_i = 0; + /* Clear and unlink a no longer necessary VDEF. 
*/ + if (build_vdef == NULL_TREE + && gimple_vdef (stmt) != NULL_TREE) + { + if (TREE_CODE (gimple_vdef (stmt)) == SSA_NAME) + { + unlink_stmt_vdef (stmt); + release_ssa_name (gimple_vdef (stmt)); + } + gimple_set_vdef (stmt, NULL_TREE); + } + + /* If we have a non-SSA_NAME VDEF, mark it for renaming. */ + if (gimple_vdef (stmt) + && TREE_CODE (gimple_vdef (stmt)) != SSA_NAME) + mark_sym_for_renaming (gimple_vdef (stmt)); + /* Check for the common case of 1 def that hasn't changed. */ if (old_ops && old_ops->next == NULL && num == 1 && (tree *) VEC_index (tree, build_defs, 0) == DEF_OP_PTR (old_ops)) @@ -716,17 +484,6 @@ finalize_ssa_defs (gimple stmt) /* Now set the stmt's operands. */ gimple_set_def_ops (stmt, new_list.next); - -#ifdef ENABLE_CHECKING - { - def_optype_p ptr; - unsigned x = 0; - for (ptr = gimple_def_ops (stmt); ptr; ptr = ptr->next) - x++; - - gcc_assert (x == num); - } -#endif } @@ -740,11 +497,29 @@ finalize_ssa_uses (gimple stmt) struct use_optype_d new_list; use_optype_p old_ops, ptr, last; + /* Pre-pend the VUSE we may have built. */ + if (build_vuse != NULL_TREE) + { + tree oldvuse = gimple_vuse (stmt); + if (oldvuse + && TREE_CODE (oldvuse) == SSA_NAME) + oldvuse = SSA_NAME_VAR (oldvuse); + if (oldvuse != (build_vuse != NULL_TREE + ? build_vuse : build_vdef)) + gimple_set_vuse (stmt, NULL_TREE); + VEC_safe_insert (tree, heap, build_uses, 0, (tree)gimple_vuse_ptr (stmt)); + } + new_list.next = NULL; last = &new_list; old_ops = gimple_use_ops (stmt); + /* Clear a no longer necessary VUSE. */ + if (build_vuse == NULL_TREE + && gimple_vuse (stmt) != NULL_TREE) + gimple_set_vuse (stmt, NULL_TREE); + /* If there is anything in the old list, free it. */ if (old_ops) { @@ -754,6 +529,15 @@ finalize_ssa_uses (gimple stmt) gimple_ssa_operands (cfun)->free_uses = old_ops; } + /* If we added a VUSE, make sure to set the operand if it is not already + present and mark it for renaming. */ + if (build_vuse != NULL_TREE + && gimple_vuse (stmt) == NULL_TREE) + { + gimple_set_vuse (stmt, gimple_vop (cfun)); + mark_sym_for_renaming (gimple_vop (cfun)); + } + /* Now create nodes for all the new nodes. */ for (new_i = 0; new_i < VEC_length (tree, build_uses); new_i++) last = add_use_op (stmt, @@ -762,259 +546,6 @@ finalize_ssa_uses (gimple stmt) /* Now set the stmt's operands. */ gimple_set_use_ops (stmt, new_list.next); - -#ifdef ENABLE_CHECKING - { - unsigned x = 0; - for (ptr = gimple_use_ops (stmt); ptr; ptr = ptr->next) - x++; - - gcc_assert (x == VEC_length (tree, build_uses)); - } -#endif -} - - -/* Takes elements from BUILD_VDEFS and turns them into vdef operands of - STMT. */ - -static inline void -finalize_ssa_vdefs (gimple stmt) -{ - unsigned new_i; - struct voptype_d new_list; - voptype_p old_ops, ptr, last; - - /* Set the symbols referenced by STMT. */ - gimple_set_stored_syms (stmt, build_stores, &operands_bitmap_obstack); - - /* If aliases have not been computed, do not instantiate a virtual - operator on STMT. Initially, we only compute the SSA form on - GIMPLE registers. The virtual SSA form is only computed after - alias analysis, so virtual operators will remain unrenamed and - the verifier will complain. However, alias analysis needs to - access symbol load/store information, so we need to compute - those. 
*/ - if (!gimple_aliases_computed_p (cfun)) - return; - - new_list.next = NULL; - last = &new_list; - - old_ops = gimple_vdef_ops (stmt); - new_i = 0; - while (old_ops && new_i < VEC_length (tree, build_vdefs)) - { - tree op = VEC_index (tree, build_vdefs, new_i); - unsigned new_uid = get_name_decl (op); - unsigned old_uid = get_name_decl (VDEF_RESULT (old_ops)); - - /* FIXME, for now each VDEF operator should have at most one - operand in their RHS. */ - gcc_assert (VDEF_NUM (old_ops) == 1); - - if (old_uid == new_uid) - { - /* If the symbols are the same, reuse the existing operand. */ - last->next = old_ops; - last = old_ops; - old_ops = old_ops->next; - last->next = NULL; - set_virtual_use_link (VDEF_OP_PTR (last, 0), stmt); - new_i++; - } - else if (old_uid < new_uid) - { - /* If old is less than new, old goes to the free list. */ - voptype_p next; - delink_imm_use (VDEF_OP_PTR (old_ops, 0)); - next = old_ops->next; - add_vop_to_freelist (old_ops); - old_ops = next; - } - else - { - /* This is a new operand. */ - last = add_vdef_op (stmt, op, 1, last); - new_i++; - } - } - - /* If there is anything remaining in BUILD_VDEFS, simply emit it. */ - for ( ; new_i < VEC_length (tree, build_vdefs); new_i++) - last = add_vdef_op (stmt, VEC_index (tree, build_vdefs, new_i), 1, last); - - /* If there is anything in the old list, free it. */ - if (old_ops) - { - for (ptr = old_ops; ptr; ptr = last) - { - last = ptr->next; - delink_imm_use (VDEF_OP_PTR (ptr, 0)); - add_vop_to_freelist (ptr); - } - } - - /* Now set STMT's operands. */ - gimple_set_vdef_ops (stmt, new_list.next); - -#ifdef ENABLE_CHECKING - { - unsigned x = 0; - for (ptr = gimple_vdef_ops (stmt); ptr; ptr = ptr->next) - x++; - - gcc_assert (x == VEC_length (tree, build_vdefs)); - } -#endif -} - - -/* Takes elements from BUILD_VUSES and turns them into VUSE operands of - STMT. */ - -static inline void -finalize_ssa_vuse_ops (gimple stmt) -{ - unsigned new_i, old_i; - voptype_p old_ops, last; - VEC(tree,heap) *new_ops; - - /* Set the symbols referenced by STMT. */ - gimple_set_loaded_syms (stmt, build_loads, &operands_bitmap_obstack); - - /* If aliases have not been computed, do not instantiate a virtual - operator on STMT. Initially, we only compute the SSA form on - GIMPLE registers. The virtual SSA form is only computed after - alias analysis, so virtual operators will remain unrenamed and - the verifier will complain. However, alias analysis needs to - access symbol load/store information, so we need to compute - those. */ - if (!gimple_aliases_computed_p (cfun)) - return; - - /* STMT should have at most one VUSE operator. */ - old_ops = gimple_vuse_ops (stmt); - gcc_assert (old_ops == NULL || old_ops->next == NULL); - - new_ops = NULL; - new_i = old_i = 0; - while (old_ops - && old_i < VUSE_NUM (old_ops) - && new_i < VEC_length (tree, build_vuses)) - { - tree new_op = VEC_index (tree, build_vuses, new_i); - tree old_op = VUSE_OP (old_ops, old_i); - unsigned new_uid = get_name_decl (new_op); - unsigned old_uid = get_name_decl (old_op); - - if (old_uid == new_uid) - { - /* If the symbols are the same, reuse the existing operand. */ - VEC_safe_push (tree, heap, new_ops, old_op); - new_i++; - old_i++; - } - else if (old_uid < new_uid) - { - /* If OLD_UID is less than NEW_UID, the old operand has - disappeared, skip to the next old operand. */ - old_i++; - } - else - { - /* This is a new operand. 
*/ - VEC_safe_push (tree, heap, new_ops, new_op); - new_i++; - } - } - - /* If there is anything remaining in the build_vuses list, simply emit it. */ - for ( ; new_i < VEC_length (tree, build_vuses); new_i++) - VEC_safe_push (tree, heap, new_ops, VEC_index (tree, build_vuses, new_i)); - - /* If there is anything in the old list, free it. */ - if (old_ops) - { - for (old_i = 0; old_i < VUSE_NUM (old_ops); old_i++) - delink_imm_use (VUSE_OP_PTR (old_ops, old_i)); - add_vop_to_freelist (old_ops); - gimple_set_vuse_ops (stmt, NULL); - } - - /* If there are any operands, instantiate a VUSE operator for STMT. */ - if (new_ops) - { - tree op; - unsigned i; - - last = add_vuse_op (stmt, NULL, VEC_length (tree, new_ops), NULL); - - for (i = 0; VEC_iterate (tree, new_ops, i, op); i++) - SET_USE (VUSE_OP_PTR (last, (int) i), op); - - gimple_set_vuse_ops (stmt, last); - VEC_free (tree, heap, new_ops); - } - -#ifdef ENABLE_CHECKING - { - unsigned x; - - if (gimple_vuse_ops (stmt)) - { - gcc_assert (gimple_vuse_ops (stmt)->next == NULL); - x = VUSE_NUM (gimple_vuse_ops (stmt)); - } - else - x = 0; - - gcc_assert (x == VEC_length (tree, build_vuses)); - } -#endif -} - -/* Return a new VUSE operand vector for STMT. */ - -static void -finalize_ssa_vuses (gimple stmt) -{ - unsigned num, num_vdefs; - unsigned vuse_index; - - /* Remove superfluous VUSE operands. If the statement already has a - VDEF operator for a variable 'a', then a VUSE for 'a' is not - needed because VDEFs imply a VUSE of the variable. For instance, - suppose that variable 'a' is pointed-to by p and q: - - # VUSE <a_2> - # a_3 = VDEF <a_2> - *p = *q; - - The VUSE <a_2> is superfluous because it is implied by the - VDEF operator. */ - num = VEC_length (tree, build_vuses); - num_vdefs = VEC_length (tree, build_vdefs); - - if (num > 0 && num_vdefs > 0) - for (vuse_index = 0; vuse_index < VEC_length (tree, build_vuses); ) - { - tree vuse; - vuse = VEC_index (tree, build_vuses, vuse_index); - if (TREE_CODE (vuse) != SSA_NAME) - { - var_ann_t ann = var_ann (vuse); - ann->in_vuse_list = 0; - if (ann->in_vdef_list) - { - VEC_ordered_remove (tree, build_vuses, vuse_index); - continue; - } - } - vuse_index++; - } - - finalize_ssa_vuse_ops (stmt); } @@ -1024,23 +555,10 @@ finalize_ssa_vuses (gimple stmt) static inline void cleanup_build_arrays (void) { - unsigned i; - tree t; - - for (i = 0; VEC_iterate (tree, build_vdefs, i, t); i++) - if (TREE_CODE (t) != SSA_NAME) - var_ann (t)->in_vdef_list = false; - - for (i = 0; VEC_iterate (tree, build_vuses, i, t); i++) - if (TREE_CODE (t) != SSA_NAME) - var_ann (t)->in_vuse_list = false; - - VEC_truncate (tree, build_vdefs, 0); - VEC_truncate (tree, build_vuses, 0); + build_vdef = NULL_TREE; + build_vuse = NULL_TREE; VEC_truncate (tree, build_defs, 0); VEC_truncate (tree, build_uses, 0); - bitmap_clear (build_loads); - bitmap_clear (build_stores); } @@ -1051,11 +569,6 @@ finalize_ssa_stmt_operands (gimple stmt) { finalize_ssa_defs (stmt); finalize_ssa_uses (stmt); - if (gimple_has_mem_ops (stmt)) - { - finalize_ssa_vdefs (stmt); - finalize_ssa_vuses (stmt); - } cleanup_build_arrays (); } @@ -1067,10 +580,8 @@ start_ssa_stmt_operands (void) { gcc_assert (VEC_length (tree, build_defs) == 0); gcc_assert (VEC_length (tree, build_uses) == 0); - gcc_assert (VEC_length (tree, build_vuses) == 0); - gcc_assert (VEC_length (tree, build_vdefs) == 0); - gcc_assert (bitmap_empty_p (build_loads)); - gcc_assert (bitmap_empty_p (build_stores)); + gcc_assert (build_vuse == NULL_TREE); + gcc_assert (build_vdef == 
NULL_TREE); } @@ -1097,31 +608,13 @@ append_use (tree *use_p) static inline void append_vdef (tree var) { - tree sym; + gcc_assert ((build_vdef == NULL_TREE + || build_vdef == var) + && (build_vuse == NULL_TREE + || build_vuse == var)); - if (TREE_CODE (var) != SSA_NAME) - { - tree mpt; - var_ann_t ann; - - /* If VAR belongs to a memory partition, use it instead of VAR. */ - mpt = memory_partition (var); - if (mpt) - var = mpt; - - /* Don't allow duplicate entries. */ - ann = get_var_ann (var); - if (ann->in_vdef_list) - return; - - ann->in_vdef_list = true; - sym = var; - } - else - sym = SSA_NAME_VAR (var); - - VEC_safe_push (tree, heap, build_vdefs, var); - bitmap_set_bit (build_stores, DECL_UID (sym)); + build_vdef = var; + build_vuse = var; } @@ -1130,303 +623,27 @@ append_vdef (tree var) static inline void append_vuse (tree var) { - tree sym; + gcc_assert (build_vuse == NULL_TREE + || build_vuse == var); - if (TREE_CODE (var) != SSA_NAME) - { - tree mpt; - var_ann_t ann; - - /* If VAR belongs to a memory partition, use it instead of VAR. */ - mpt = memory_partition (var); - if (mpt) - var = mpt; - - /* Don't allow duplicate entries. */ - ann = get_var_ann (var); - if (ann->in_vuse_list) - return; - else if (ann->in_vdef_list) - { - /* We don't want a vuse if we already have a vdef, but we must - still put this in build_loads. */ - bitmap_set_bit (build_loads, DECL_UID (var)); - return; - } - - ann->in_vuse_list = true; - sym = var; - } - else - sym = SSA_NAME_VAR (var); - - VEC_safe_push (tree, heap, build_vuses, var); - bitmap_set_bit (build_loads, DECL_UID (sym)); + build_vuse = var; } - -/* REF is a tree that contains the entire pointer dereference - expression, if available, or NULL otherwise. ALIAS is the variable - we are asking if REF can access. OFFSET and SIZE come from the - memory access expression that generated this virtual operand. - - XXX: We should handle the NO_ALIAS attributes here. */ - -static bool -access_can_touch_variable (tree ref, tree alias, HOST_WIDE_INT offset, - HOST_WIDE_INT size) -{ - bool offsetgtz = offset > 0; - unsigned HOST_WIDE_INT uoffset = (unsigned HOST_WIDE_INT) offset; - tree base = ref ? get_base_address (ref) : NULL; - - /* If ALIAS is .GLOBAL_VAR then the memory reference REF must be - using a call-clobbered memory tag. By definition, call-clobbered - memory tags can always touch .GLOBAL_VAR. */ - if (alias == gimple_global_var (cfun)) - return true; - - /* If ref is a TARGET_MEM_REF, just return true, as we can't really - disambiguate them right now. */ - if (ref && TREE_CODE (ref) == TARGET_MEM_REF) - return true; - - /* Without strict aliasing, it is impossible for a component access - through a pointer to touch a random variable, unless that - variable *is* a structure or a pointer. - - That is, given p->c, and some random global variable b, - there is no legal way that p->c could be an access to b. - - Without strict aliasing on, we consider it legal to do something - like: - - struct foos { int l; }; - int foo; - static struct foos *getfoo(void); - int main (void) - { - struct foos *f = getfoo(); - f->l = 1; - foo = 2; - if (f->l == 1) - abort(); - exit(0); - } - static struct foos *getfoo(void) - { return (struct foos *)&foo; } - - (taken from 20000623-1.c) - - The docs also say/imply that access through union pointers - is legal (but *not* if you take the address of the union member, - i.e. 
the inverse), such that you can do - - typedef union { - int d; - } U; - - int rv; - void breakme() - { - U *rv0; - U *pretmp = (U*)&rv; - rv0 = pretmp; - rv0->d = 42; - } - To implement this, we just punt on accesses through union - pointers entirely. - - Another case we have to allow is accessing a variable - through an array access at offset zero. This happens from - code generated by the fortran frontend like - - char[1:1] & my_char_ref; - char my_char; - my_char_ref_1 = (char[1:1] &) &my_char; - D.874_2 = (*my_char_ref_1)[1]{lb: 1 sz: 1}; - */ - if (ref - && flag_strict_aliasing - && TREE_CODE (ref) != INDIRECT_REF - && !MTAG_P (alias) - && base - && (TREE_CODE (base) != INDIRECT_REF - || TREE_CODE (TREE_TYPE (base)) != UNION_TYPE) - && (TREE_CODE (base) != INDIRECT_REF - || TREE_CODE (ref) != ARRAY_REF - || offset != 0 - || (DECL_SIZE (alias) - && TREE_CODE (DECL_SIZE (alias)) == INTEGER_CST - && size != -1 - && (unsigned HOST_WIDE_INT)size - != TREE_INT_CST_LOW (DECL_SIZE (alias)))) - && !AGGREGATE_TYPE_P (TREE_TYPE (alias)) - && TREE_CODE (TREE_TYPE (alias)) != COMPLEX_TYPE - && !var_ann (alias)->is_heapvar - /* When the struct has may_alias attached to it, we need not to - return true. */ - && get_alias_set (base)) - { -#ifdef ACCESS_DEBUGGING - fprintf (stderr, "Access to "); - print_generic_expr (stderr, ref, 0); - fprintf (stderr, " may not touch "); - print_generic_expr (stderr, alias, 0); - fprintf (stderr, " in function %s\n", get_name (current_function_decl)); -#endif - return false; - } - - /* If the offset of the access is greater than the size of one of - the possible aliases, it can't be touching that alias, because it - would be past the end of the structure. */ - else if (ref - && flag_strict_aliasing - && TREE_CODE (ref) != INDIRECT_REF - && !MTAG_P (alias) - && !var_ann (alias)->is_heapvar - && !POINTER_TYPE_P (TREE_TYPE (alias)) - && offsetgtz - && DECL_SIZE (alias) - && TREE_CODE (DECL_SIZE (alias)) == INTEGER_CST - && uoffset >= TREE_INT_CST_LOW (DECL_SIZE (alias))) - { -#ifdef ACCESS_DEBUGGING - fprintf (stderr, "Access to "); - print_generic_expr (stderr, ref, 0); - fprintf (stderr, " may not touch "); - print_generic_expr (stderr, alias, 0); - fprintf (stderr, " in function %s\n", get_name (current_function_decl)); -#endif - return false; - } - - return true; -} - -/* Add VAR to the virtual operands for STMT. FLAGS is as in - get_expr_operands. FULL_REF is a tree that contains the entire - pointer dereference expression, if available, or NULL otherwise. - OFFSET and SIZE come from the memory access expression that - generated this virtual operand. IS_CALL_SITE is true if the - affected statement is a call site. */ +/* Add virtual operands for STMT. FLAGS is as in get_expr_operands. */ static void -add_virtual_operand (tree var, gimple stmt, int flags, - tree full_ref, HOST_WIDE_INT offset, - HOST_WIDE_INT size, bool is_call_site) +add_virtual_operand (gimple stmt ATTRIBUTE_UNUSED, int flags) { - bitmap aliases = NULL; - tree sym; - var_ann_t v_ann; - - sym = (TREE_CODE (var) == SSA_NAME ? SSA_NAME_VAR (var) : var); - v_ann = var_ann (sym); - - /* Mark the statement as having memory operands. */ - gimple_set_references_memory (stmt, true); - - /* If the variable cannot be modified and this is a VDEF change - it into a VUSE. This happens when read-only variables are marked - call-clobbered and/or aliased to writable variables. So we only - check that this only happens on non-specific stores. - - Note that if this is a specific store, i.e. 
associated with a - MODIFY_EXPR, then we can't suppress the VDEF, lest we run - into validation problems. - - This can happen when programs cast away const, leaving us with a - store to read-only memory. If the statement is actually executed - at runtime, then the program is ill formed. If the statement is - not executed then all is well. At the very least, we cannot ICE. */ - if ((flags & opf_implicit) && unmodifiable_var_p (var)) - flags &= ~opf_def; - - /* The variable is not a GIMPLE register. Add it (or its aliases) to - virtual operands, unless the caller has specifically requested - not to add virtual operands (used when adding operands inside an + /* Add virtual operands to the stmt, unless the caller has specifically + requested not to do that (used when adding operands inside an ADDR_EXPR expression). */ if (flags & opf_no_vops) return; - - if (MTAG_P (var)) - aliases = MTAG_ALIASES (var); - - if (aliases == NULL) - { - if (!gimple_aliases_computed_p (cfun) && (flags & opf_def)) - gimple_set_has_volatile_ops (stmt, true); - /* The variable is not aliased or it is an alias tag. */ - if (flags & opf_def) - append_vdef (var); - else - append_vuse (var); - } + if (flags & opf_def) + append_vdef (gimple_vop (cfun)); else - { - bitmap_iterator bi; - unsigned int i; - bool none_added = true; - - /* The variable is aliased. Add its aliases to the virtual - operands. */ - gcc_assert (!bitmap_empty_p (aliases)); - - EXECUTE_IF_SET_IN_BITMAP (aliases, 0, i, bi) - { - tree al = referenced_var (i); - - /* Call-clobbered tags may have non-call-clobbered - symbols in their alias sets. Ignore them if we are - adding VOPs for a call site. */ - if (is_call_site && !is_call_clobbered (al)) - continue; - - /* If we do not know the full reference tree or if the access is - unspecified [0, -1], we cannot prune it. Otherwise try doing - so using access_can_touch_variable. */ - if (full_ref - && !access_can_touch_variable (full_ref, al, offset, size)) - continue; - - if (flags & opf_def) - append_vdef (al); - else - append_vuse (al); - none_added = false; - } - - if (flags & opf_def) - { - /* If the variable is also an alias tag, add a virtual - operand for it, otherwise we will miss representing - references to the members of the variable's alias set. - This fixes the bug in gcc.c-torture/execute/20020503-1.c. - - It is also necessary to add bare defs on clobbers for - SMT's, so that bare SMT uses caused by pruning all the - aliases will link up properly with calls. In order to - keep the number of these bare defs we add down to the - minimum necessary, we keep track of which SMT's were used - alone in statement vdefs or VUSEs. */ - if (none_added - || (TREE_CODE (var) == SYMBOL_MEMORY_TAG - && is_call_site)) - append_vdef (var); - } - else - { - /* Even if no aliases have been added, we still need to - establish def-use and use-def chains, lest - transformations think that this is not a memory - reference. For an example of this scenario, see - testsuite/g++.dg/opt/cleanup1.C. */ - if (none_added) - append_vuse (var); - } - } + append_vuse (gimple_vop (cfun)); } @@ -1460,106 +677,43 @@ add_stmt_operand (tree *var_p, gimple stmt, int flags) append_use (var_p); } else - add_virtual_operand (var, stmt, flags, NULL_TREE, 0, -1, false); + add_virtual_operand (stmt, flags); } -/* Subroutine of get_indirect_ref_operands. ADDR is the address - that is dereferenced, the meaning of the rest of the arguments - is the same as in get_indirect_ref_operands. */ +/* Add the base address of REF to SET. 
*/ static void -get_addr_dereference_operands (gimple stmt, tree *addr, int flags, - tree full_ref, HOST_WIDE_INT offset, - HOST_WIDE_INT size, bool recurse_on_base) +add_to_addressable_set (tree ref, bitmap *set) { - tree ptr = *addr; - - /* Mark the statement as having memory operands. */ - gimple_set_references_memory (stmt, true); + tree var; - if (SSA_VAR_P (ptr)) + /* Note that it is *NOT OKAY* to use the target of a COMPONENT_REF + as the only thing we take the address of. If VAR is a structure, + taking the address of a field means that the whole structure may + be referenced using pointer arithmetic. See PR 21407 and the + ensuing mailing list discussion. */ + var = get_base_address (ref); + if (var && SSA_VAR_P (var)) { - struct ptr_info_def *pi = NULL; + if (*set == NULL) + *set = BITMAP_ALLOC (&operands_bitmap_obstack); - /* If PTR has flow-sensitive points-to information, use it. */ - if (TREE_CODE (ptr) == SSA_NAME - && (pi = SSA_NAME_PTR_INFO (ptr)) != NULL - && pi->name_mem_tag) - { - /* PTR has its own memory tag. Use it. */ - add_virtual_operand (pi->name_mem_tag, stmt, flags, - full_ref, offset, size, false); - } - else - { - /* If PTR is not an SSA_NAME or it doesn't have a name - tag, use its symbol memory tag. */ - var_ann_t v_ann; - - /* If we are emitting debugging dumps, display a warning if - PTR is an SSA_NAME with no flow-sensitive alias - information. That means that we may need to compute - aliasing again or that a propagation pass forgot to - update the alias information on the pointers. */ - if (dump_file - && TREE_CODE (ptr) == SSA_NAME - && (pi == NULL - || (pi->name_mem_tag == NULL_TREE - && !pi->pt_anything)) - && gimple_aliases_computed_p (cfun)) - { - fprintf (dump_file, - "NOTE: no flow-sensitive alias info for "); - print_generic_expr (dump_file, ptr, dump_flags); - fprintf (dump_file, " in "); - print_gimple_stmt (dump_file, stmt, 0, 0); - } - - if (TREE_CODE (ptr) == SSA_NAME) - ptr = SSA_NAME_VAR (ptr); - v_ann = var_ann (ptr); - - /* If we don't know what this pointer points to then we have - to make sure to not prune virtual operands based on offset - and size. */ - if (v_ann->symbol_mem_tag) - { - add_virtual_operand (v_ann->symbol_mem_tag, stmt, flags, - full_ref, 0, -1, false); - /* Make sure we add the SMT itself. */ - if (!(flags & opf_no_vops)) - { - if (flags & opf_def) - append_vdef (v_ann->symbol_mem_tag); - else - append_vuse (v_ann->symbol_mem_tag); - } - } - - /* Aliasing information is missing; mark statement as - volatile so we won't optimize it out too actively. */ - else if (!gimple_aliases_computed_p (cfun) - && (flags & opf_def)) - gimple_set_has_volatile_ops (stmt, true); - } - } - else if (TREE_CODE (ptr) == INTEGER_CST) - { - /* If a constant is used as a pointer, we can't generate a real - operand for it but we mark the statement volatile to prevent - optimizations from messing things up. */ - gimple_set_has_volatile_ops (stmt, true); - return; - } - else - { - /* Ok, this isn't even is_gimple_min_invariant. Something's broke. */ - gcc_unreachable (); + bitmap_set_bit (*set, DECL_UID (var)); + TREE_ADDRESSABLE (var) = 1; } +} - /* If requested, add a USE operand for the base pointer. */ - if (recurse_on_base) - get_expr_operands (stmt, addr, opf_use); +/* Add the base address of REF to the set of addresses taken by STMT. + REF may be a single variable whose address has been taken or any + other valid GIMPLE memory reference (structure reference, array, + etc). 
If the base address of REF is a decl that has sub-variables, + also add all of its sub-variables. */ + +static void +gimple_add_to_addresses_taken (gimple stmt, tree ref) +{ + gcc_assert (gimple_has_ops (stmt)); + add_to_addressable_set (ref, gimple_addresses_taken_ptr (stmt)); } @@ -1571,19 +725,12 @@ get_addr_dereference_operands (gimple stmt, tree *addr, int flags, FLAGS is as in get_expr_operands. - FULL_REF contains the full pointer dereference expression, if we - have it, or NULL otherwise. - - OFFSET and SIZE are the location of the access inside the - dereferenced pointer, if known. - RECURSE_ON_BASE should be set to true if we want to continue calling get_expr_operands on the base pointer, and false if something else will do it for us. */ static void -get_indirect_ref_operands (gimple stmt, tree expr, int flags, tree full_ref, - HOST_WIDE_INT offset, HOST_WIDE_INT size, +get_indirect_ref_operands (gimple stmt, tree expr, int flags, bool recurse_on_base) { tree *pptr = &TREE_OPERAND (expr, 0); @@ -1591,8 +738,12 @@ get_indirect_ref_operands (gimple stmt, tree expr, int flags, tree full_ref, if (TREE_THIS_VOLATILE (expr)) gimple_set_has_volatile_ops (stmt, true); - get_addr_dereference_operands (stmt, pptr, flags, full_ref, offset, size, - recurse_on_base); + /* Add the VOP. */ + add_virtual_operand (stmt, flags); + + /* If requested, add a USE operand for the base pointer. */ + if (recurse_on_base) + get_expr_operands (stmt, pptr, opf_use); } @@ -1601,11 +752,6 @@ get_indirect_ref_operands (gimple stmt, tree expr, int flags, tree full_ref, static void get_tmr_operands (gimple stmt, tree expr, int flags) { - tree tag; - - /* Mark the statement as having memory operands. */ - gimple_set_references_memory (stmt, true); - /* First record the real operands. */ get_expr_operands (stmt, &TMR_BASE (expr), opf_use); get_expr_operands (stmt, &TMR_INDEX (expr), opf_use); @@ -1613,154 +759,7 @@ get_tmr_operands (gimple stmt, tree expr, int flags) if (TMR_SYMBOL (expr)) gimple_add_to_addresses_taken (stmt, TMR_SYMBOL (expr)); - tag = TMR_TAG (expr); - if (!tag) - { - /* Something weird, so ensure that we will be careful. */ - gimple_set_has_volatile_ops (stmt, true); - return; - } - if (!MTAG_P (tag)) - { - get_expr_operands (stmt, &tag, flags); - return; - } - - add_virtual_operand (tag, stmt, flags, expr, 0, -1, false); -} - - -/* Add clobbering definitions for .GLOBAL_VAR or for each of the call - clobbered variables in the function. */ - -static void -add_call_clobber_ops (gimple stmt, tree callee ATTRIBUTE_UNUSED) -{ - unsigned u; - bitmap_iterator bi; - bitmap not_read_b, not_written_b; - - gcc_assert (!(gimple_call_flags (stmt) & (ECF_PURE | ECF_CONST))); - - /* If we created .GLOBAL_VAR earlier, just use it. */ - if (gimple_global_var (cfun)) - { - tree var = gimple_global_var (cfun); - add_virtual_operand (var, stmt, opf_def, NULL, 0, -1, true); - return; - } - - /* Get info for local and module level statics. There is a bit - set for each static if the call being processed does not read - or write that variable. */ - not_read_b = callee ? ipa_reference_get_not_read_global (cgraph_node (callee)) : NULL; - not_written_b = callee ? ipa_reference_get_not_written_global (cgraph_node (callee)) : NULL; - - /* Add a VDEF operand for every call clobbered variable. */ - EXECUTE_IF_SET_IN_BITMAP (gimple_call_clobbered_vars (cfun), 0, u, bi) - { - tree var = referenced_var_lookup (u); - tree real_var = var; - bool not_read; - bool not_written; - - not_read = not_read_b - ? 
bitmap_bit_p (not_read_b, DECL_UID (real_var)) - : false; - - not_written = not_written_b - ? bitmap_bit_p (not_written_b, DECL_UID (real_var)) - : false; - gcc_assert (!unmodifiable_var_p (var)); - - clobber_stats.clobbered_vars++; - - /* See if this variable is really clobbered by this function. */ - - if (not_written) - { - clobber_stats.static_write_clobbers_avoided++; - if (!not_read) - add_virtual_operand (var, stmt, opf_use, NULL, 0, -1, true); - else - clobber_stats.static_read_clobbers_avoided++; - } - else - add_virtual_operand (var, stmt, opf_def, NULL, 0, -1, true); - } -} - - -/* Add VUSE operands for .GLOBAL_VAR or all call clobbered variables in the - function. */ - -static void -add_call_read_ops (gimple stmt, tree callee ATTRIBUTE_UNUSED) -{ - unsigned u; - bitmap_iterator bi; - bitmap not_read_b; - - /* Const functions do not reference memory. */ - if (gimple_call_flags (stmt) & ECF_CONST) - return; - - not_read_b = callee ? ipa_reference_get_not_read_global (cgraph_node (callee)) : NULL; - - /* For pure functions we compute non-escaped uses separately. */ - if (gimple_call_flags (stmt) & ECF_PURE) - EXECUTE_IF_SET_IN_BITMAP (gimple_call_used_vars (cfun), 0, u, bi) - { - tree var = referenced_var_lookup (u); - tree real_var = var; - bool not_read; - - if (unmodifiable_var_p (var)) - continue; - - not_read = not_read_b - ? bitmap_bit_p (not_read_b, DECL_UID (real_var)) - : false; - - clobber_stats.readonly_clobbers++; - - /* See if this variable is really used by this function. */ - if (!not_read) - add_virtual_operand (var, stmt, opf_use, NULL, 0, -1, true); - else - clobber_stats.static_readonly_clobbers_avoided++; - } - - /* Add a VUSE for .GLOBAL_VAR if it has been created. See - add_referenced_var for the heuristic used to decide whether to - create .GLOBAL_VAR. */ - if (gimple_global_var (cfun)) - { - tree var = gimple_global_var (cfun); - add_virtual_operand (var, stmt, opf_use, NULL, 0, -1, true); - return; - } - - /* Add a VUSE for each call-clobbered variable. */ - EXECUTE_IF_SET_IN_BITMAP (gimple_call_clobbered_vars (cfun), 0, u, bi) - { - tree var = referenced_var (u); - tree real_var = var; - bool not_read; - - clobber_stats.readonly_clobbers++; - - not_read = not_read_b ? bitmap_bit_p (not_read_b, DECL_UID (real_var)) - : false; - - if (not_read) - { - clobber_stats.static_readonly_clobbers_avoided++; - continue; - } - - add_virtual_operand (var, stmt, opf_use, NULL, 0, -1, true); - } + add_virtual_operand (stmt, flags); } @@ -1768,25 +767,22 @@ add_call_read_ops (gimple stmt, tree callee ATTRIBUTE_UNUSED) escape, add them to the VDEF/VUSE lists for it. */ static void -maybe_add_call_clobbered_vops (gimple stmt) +maybe_add_call_vops (gimple stmt) { int call_flags = gimple_call_flags (stmt); - /* Mark the statement as having memory operands. */ - gimple_set_references_memory (stmt, true); - /* If aliases have been computed already, add VDEF or VUSE operands for all the symbols that have been found to be call-clobbered. */ - if (gimple_aliases_computed_p (cfun) && !(call_flags & ECF_NOVOPS)) + if (!(call_flags & ECF_NOVOPS)) { /* A 'pure' or a 'const' function never call-clobbers anything. A 'noreturn' function might, but since we don't return anyway there is no point in recording that. 
*/ if (!(call_flags & (ECF_PURE | ECF_CONST | ECF_NORETURN))) - add_call_clobber_ops (stmt, gimple_call_fndecl (stmt)); + add_virtual_operand (stmt, opf_def); else if (!(call_flags & ECF_CONST)) - add_call_read_ops (stmt, gimple_call_fndecl (stmt)); + add_virtual_operand (stmt, opf_use); } } @@ -1854,23 +850,7 @@ get_asm_expr_operands (gimple stmt) tree link = gimple_asm_clobber_op (stmt, i); if (strcmp (TREE_STRING_POINTER (TREE_VALUE (link)), "memory") == 0) { - unsigned i; - bitmap_iterator bi; - - /* Mark the statement as having memory operands. */ - gimple_set_references_memory (stmt, true); - - EXECUTE_IF_SET_IN_BITMAP (gimple_call_clobbered_vars (cfun), 0, i, bi) - { - tree var = referenced_var (i); - add_stmt_operand (&var, stmt, opf_def | opf_implicit); - } - - EXECUTE_IF_SET_IN_BITMAP (gimple_addressable_vars (cfun), 0, i, bi) - { - tree var = referenced_var (i); - add_stmt_operand (&var, stmt, opf_def | opf_implicit); - } + add_virtual_operand (stmt, opf_def); break; } } @@ -1918,8 +898,6 @@ get_expr_operands (gimple stmt, tree *expr_p, int flags) return; case SSA_NAME: - case SYMBOL_MEMORY_TAG: - case NAME_MEMORY_TAG: add_stmt_operand (expr_p, stmt, flags); return; @@ -1935,7 +913,7 @@ get_expr_operands (gimple stmt, tree *expr_p, int flags) case ALIGN_INDIRECT_REF: case INDIRECT_REF: - get_indirect_ref_operands (stmt, expr, flags, expr, 0, -1, true); + get_indirect_ref_operands (stmt, expr, flags, true); return; case TARGET_MEM_REF: @@ -1957,8 +935,7 @@ get_expr_operands (gimple stmt, tree *expr_p, int flags) ref = get_ref_base_and_extent (expr, &offset, &size, &maxsize); if (TREE_CODE (ref) == INDIRECT_REF) { - get_indirect_ref_operands (stmt, ref, flags, expr, offset, - maxsize, false); + get_indirect_ref_operands (stmt, ref, flags, false); flags |= opf_no_vops; } @@ -2098,7 +1075,7 @@ parse_ssa_operands (gimple stmt) /* Add call-clobbered operands, if needed. */ if (code == GIMPLE_CALL) - maybe_add_call_clobbered_vops (stmt); + maybe_add_call_vops (stmt); } } @@ -2111,7 +1088,6 @@ build_ssa_operands (gimple stmt) /* Initially assume that the statement has no volatile operands and makes no memory references. */ gimple_set_has_volatile_ops (stmt, false); - gimple_set_references_memory (stmt, false); /* Just clear the bitmap so we don't end up reallocating it over and over. */ if (gimple_addresses_taken (stmt)) @@ -2119,14 +1095,7 @@ build_ssa_operands (gimple stmt) start_ssa_stmt_operands (); parse_ssa_operands (stmt); - operand_build_sort_virtual (build_vuses); - operand_build_sort_virtual (build_vdefs); finalize_ssa_stmt_operands (stmt); - - /* For added safety, assume that statements with volatile operands - also reference memory. 
*/ - if (gimple_has_volatile_ops (stmt)) - gimple_set_references_memory (stmt, true); } @@ -2138,9 +1107,6 @@ free_stmt_operands (gimple stmt) { def_optype_p defs = gimple_def_ops (stmt), last_def; use_optype_p uses = gimple_use_ops (stmt), last_use; - voptype_p vuses = gimple_vuse_ops (stmt); - voptype_p vdefs = gimple_vdef_ops (stmt), vdef, next_vdef; - unsigned i; if (defs) { @@ -2161,32 +1127,13 @@ free_stmt_operands (gimple stmt) gimple_set_use_ops (stmt, NULL); } - if (vuses) - { - for (i = 0; i < VUSE_NUM (vuses); i++) - delink_imm_use (VUSE_OP_PTR (vuses, i)); - add_vop_to_freelist (vuses); - gimple_set_vuse_ops (stmt, NULL); - } - - if (vdefs) - { - for (vdef = vdefs; vdef; vdef = next_vdef) - { - next_vdef = vdef->next; - delink_imm_use (VDEF_OP_PTR (vdef, 0)); - add_vop_to_freelist (vdef); - } - gimple_set_vdef_ops (stmt, NULL); - } - if (gimple_has_ops (stmt)) gimple_set_addresses_taken (stmt, NULL); if (gimple_has_mem_ops (stmt)) { - gimple_set_stored_syms (stmt, NULL, &operands_bitmap_obstack); - gimple_set_loaded_syms (stmt, NULL, &operands_bitmap_obstack); + gimple_set_vuse (stmt, NULL_TREE); + gimple_set_vdef (stmt, NULL_TREE); } } @@ -2211,113 +1158,6 @@ update_stmt_operands (gimple stmt) } -/* Copies virtual operands from SRC to DST. */ - -void -copy_virtual_operands (gimple dest, gimple src) -{ - unsigned int i, n; - voptype_p src_vuses, dest_vuses; - voptype_p src_vdefs, dest_vdefs; - struct voptype_d vuse; - struct voptype_d vdef; - - if (!gimple_has_mem_ops (src)) - return; - - gimple_set_vdef_ops (dest, NULL); - gimple_set_vuse_ops (dest, NULL); - - gimple_set_stored_syms (dest, gimple_stored_syms (src), - &operands_bitmap_obstack); - gimple_set_loaded_syms (dest, gimple_loaded_syms (src), - &operands_bitmap_obstack); - - /* Copy all the VUSE operators and corresponding operands. */ - dest_vuses = &vuse; - for (src_vuses = gimple_vuse_ops (src); - src_vuses; - src_vuses = src_vuses->next) - { - n = VUSE_NUM (src_vuses); - dest_vuses = add_vuse_op (dest, NULL_TREE, n, dest_vuses); - for (i = 0; i < n; i++) - SET_USE (VUSE_OP_PTR (dest_vuses, i), VUSE_OP (src_vuses, i)); - - if (gimple_vuse_ops (dest) == NULL) - gimple_set_vuse_ops (dest, vuse.next); - } - - /* Copy all the VDEF operators and corresponding operands. */ - dest_vdefs = &vdef; - for (src_vdefs = gimple_vdef_ops (src); - src_vdefs; - src_vdefs = src_vdefs->next) - { - n = VUSE_NUM (src_vdefs); - dest_vdefs = add_vdef_op (dest, NULL_TREE, n, dest_vdefs); - VDEF_RESULT (dest_vdefs) = VDEF_RESULT (src_vdefs); - for (i = 0; i < n; i++) - SET_USE (VUSE_OP_PTR (dest_vdefs, i), VUSE_OP (src_vdefs, i)); - - if (gimple_vdef_ops (dest) == NULL) - gimple_set_vdef_ops (dest, vdef.next); - } -} - - -/* Specifically for use in DOM's expression analysis. Given a store, we - create an artificial stmt which looks like a load from the store, this can - be used to eliminate redundant loads. OLD_OPS are the operands from the - store stmt, and NEW_STMT is the new load which represents a load of the - values stored. If DELINK_IMM_USES_P is specified, the immediate - uses of this stmt will be de-linked. */ - -void -create_ssa_artificial_load_stmt (gimple new_stmt, gimple old_stmt, - bool delink_imm_uses_p) -{ - tree op; - ssa_op_iter iter; - use_operand_p use_p; - unsigned i; - - gimple_set_modified (new_stmt, false); - - /* Process NEW_STMT looking for operands. 
*/ - start_ssa_stmt_operands (); - parse_ssa_operands (new_stmt); - - for (i = 0; VEC_iterate (tree, build_vuses, i, op); i++) - if (TREE_CODE (op) != SSA_NAME) - var_ann (op)->in_vuse_list = false; - - for (i = 0; VEC_iterate (tree, build_vdefs, i, op); i++) - if (TREE_CODE (op) != SSA_NAME) - var_ann (op)->in_vdef_list = false; - - /* Remove any virtual operands that were found. */ - VEC_truncate (tree, build_vdefs, 0); - VEC_truncate (tree, build_vuses, 0); - - /* Clear the loads and stores bitmaps. */ - bitmap_clear (build_loads); - bitmap_clear (build_stores); - - /* For each VDEF on the original statement, we want to create a - VUSE of the VDEF result operand on the new statement. */ - FOR_EACH_SSA_TREE_OPERAND (op, old_stmt, iter, SSA_OP_VDEF) - append_vuse (op); - - finalize_ssa_stmt_operands (new_stmt); - - /* All uses in this fake stmt must not be in the immediate use lists. */ - if (delink_imm_uses_p) - FOR_EACH_SSA_USE_OPERAND (use_p, new_stmt, iter, SSA_OP_ALL_USES) - delink_imm_use (use_p); -} - - /* Swap operands EXP0 and EXP1 in statement STMT. No attempt is done to test the validity of the swap operation. */ @@ -2366,43 +1206,6 @@ swap_tree_operands (gimple stmt, tree *exp0, tree *exp1) *exp1 = op0; } -/* Add the base address of REF to SET. */ - -void -add_to_addressable_set (tree ref, bitmap *set) -{ - tree var; - - /* Note that it is *NOT OKAY* to use the target of a COMPONENT_REF - as the only thing we take the address of. If VAR is a structure, - taking the address of a field means that the whole structure may - be referenced using pointer arithmetic. See PR 21407 and the - ensuing mailing list discussion. */ - var = get_base_address (ref); - if (var && SSA_VAR_P (var)) - { - if (*set == NULL) - *set = BITMAP_ALLOC (&operands_bitmap_obstack); - - bitmap_set_bit (*set, DECL_UID (var)); - TREE_ADDRESSABLE (var) = 1; - } -} - - -/* Add the base address of REF to the set of addresses taken by STMT. - REF may be a single variable whose address has been taken or any - other valid GIMPLE memory reference (structure reference, array, - etc). If the base address of REF is a decl that has sub-variables, - also add all of its sub-variables. */ - -void -gimple_add_to_addresses_taken (gimple stmt, tree ref) -{ - gcc_assert (gimple_has_ops (stmt)); - add_to_addressable_set (ref, gimple_addresses_taken_ptr (stmt)); -} - /* Scan the immediate_use list for VAR making sure its linked properly. Return TRUE if there is a problem and emit an error message to F. */ @@ -2547,192 +1350,94 @@ debug_immediate_uses_for (tree var) } -/* Create a new change buffer for the statement pointed by STMT_P and - push the buffer into SCB_STACK. Each change buffer - records state information needed to determine what changed in the - statement. Mainly, this keeps track of symbols that may need to be - put into SSA form, SSA name replacements and other information - needed to keep the SSA form up to date. */ +/* Push *STMT_P on the SCB_STACK. This function is deprecated, do not + introduce new uses of it. */ void push_stmt_changes (gimple *stmt_p) { - gimple stmt; - scb_t buf; - - stmt = *stmt_p; + gimple stmt = *stmt_p; /* It makes no sense to keep track of PHI nodes. */ if (gimple_code (stmt) == GIMPLE_PHI) return; - buf = XNEW (struct scb_d); - memset (buf, 0, sizeof *buf); - - buf->stmt_p = stmt_p; - - if (gimple_references_memory_p (stmt)) - { - tree op; - ssa_op_iter i; - - FOR_EACH_SSA_TREE_OPERAND (op, stmt, i, SSA_OP_VUSE) - { - tree sym = TREE_CODE (op) == SSA_NAME ? 
SSA_NAME_VAR (op) : op; - if (buf->loads == NULL) - buf->loads = BITMAP_ALLOC (NULL); - bitmap_set_bit (buf->loads, DECL_UID (sym)); - } - - FOR_EACH_SSA_TREE_OPERAND (op, stmt, i, SSA_OP_VDEF) - { - tree sym = TREE_CODE (op) == SSA_NAME ? SSA_NAME_VAR (op) : op; - if (buf->stores == NULL) - buf->stores = BITMAP_ALLOC (NULL); - bitmap_set_bit (buf->stores, DECL_UID (sym)); - } - } - - VEC_safe_push (scb_t, heap, scb_stack, buf); -} - - -/* Given two sets S1 and S2, mark the symbols that differ in S1 and S2 - for renaming. The set to mark for renaming is (S1 & ~S2) | (S2 & ~S1). */ - -static void -mark_difference_for_renaming (bitmap s1, bitmap s2) -{ - if (s1 == NULL && s2 == NULL) - return; - - if (s1 && s2 == NULL) - mark_set_for_renaming (s1); - else if (s1 == NULL && s2) - mark_set_for_renaming (s2); - else if (!bitmap_equal_p (s1, s2)) - { - bitmap t1 = BITMAP_ALLOC (NULL); - bitmap_xor (t1, s1, s2); - mark_set_for_renaming (t1); - BITMAP_FREE (t1); - } + VEC_safe_push (gimple_p, heap, scb_stack, stmt_p); } - -/* Pop the top SCB from SCB_STACK and act on the differences between +/* Pop the top stmt from SCB_STACK and act on the differences between what was recorded by push_stmt_changes and the current state of - the statement. */ + the statement. This function is deprecated, do not introduce + new uses of it. */ void pop_stmt_changes (gimple *stmt_p) { - tree op; - gimple stmt; + gimple *stmt2_p, stmt = *stmt_p; ssa_op_iter iter; - bitmap loads, stores; - scb_t buf; - - stmt = *stmt_p; + tree op; /* It makes no sense to keep track of PHI nodes. */ if (gimple_code (stmt) == GIMPLE_PHI) return; - buf = VEC_pop (scb_t, scb_stack); - gcc_assert (stmt_p == buf->stmt_p); + stmt2_p = VEC_pop (gimple_p, scb_stack); + gcc_assert (stmt_p == stmt2_p); /* Force an operand re-scan on the statement and mark any newly - exposed variables. */ + exposed variables. This also will mark the virtual operand + for renaming if necessary. */ update_stmt (stmt); - /* Determine whether any memory symbols need to be renamed. If the - sets of loads and stores are different after the statement is - modified, then the affected symbols need to be renamed. - - Note that it may be possible for the statement to not reference - memory anymore, but we still need to act on the differences in - the sets of symbols. */ - loads = stores = NULL; - if (gimple_references_memory_p (stmt)) - { - tree op; - ssa_op_iter i; - - FOR_EACH_SSA_TREE_OPERAND (op, stmt, i, SSA_OP_VUSE) - { - tree sym = TREE_CODE (op) == SSA_NAME ? SSA_NAME_VAR (op) : op; - if (loads == NULL) - loads = BITMAP_ALLOC (NULL); - bitmap_set_bit (loads, DECL_UID (sym)); - } - - FOR_EACH_SSA_TREE_OPERAND (op, stmt, i, SSA_OP_VDEF) - { - tree sym = TREE_CODE (op) == SSA_NAME ? SSA_NAME_VAR (op) : op; - if (stores == NULL) - stores = BITMAP_ALLOC (NULL); - bitmap_set_bit (stores, DECL_UID (sym)); - } - } - - /* If LOADS is different from BUF->LOADS, the affected - symbols need to be marked for renaming. */ - mark_difference_for_renaming (loads, buf->loads); - - /* Similarly for STORES and BUF->STORES. */ - mark_difference_for_renaming (stores, buf->stores); - - /* Mark all the naked GIMPLE register operands for renaming. */ + /* Mark all the naked GIMPLE register operands for renaming. + ??? Especially this is considered bad behavior of the caller, + it should have updated SSA form manually. Even more so as + we do not have a way to verify that no SSA names for op are + already in use. 
*/ FOR_EACH_SSA_TREE_OPERAND (op, stmt, iter, SSA_OP_DEF|SSA_OP_USE) if (DECL_P (op)) mark_sym_for_renaming (op); - - /* FIXME, need to add more finalizers here. Cleanup EH info, - recompute invariants for address expressions, add - SSA replacement mappings, etc. For instance, given - testsuite/gcc.c-torture/compile/pr16808.c, we fold a statement of - the form: - - # SMT.4_20 = VDEF <SMT.4_16> - D.1576_11 = 1.0e+0; - - So, the VDEF will disappear, but instead of marking SMT.4 for - renaming it would be far more efficient to establish a - replacement mapping that would replace every reference of - SMT.4_20 with SMT.4_16. */ - - /* Free memory used by the buffer. */ - BITMAP_FREE (buf->loads); - BITMAP_FREE (buf->stores); - BITMAP_FREE (loads); - BITMAP_FREE (stores); - buf->stmt_p = NULL; - free (buf); } - -/* Discard the topmost change buffer from SCB_STACK. This is useful +/* Discard the topmost stmt from SCB_STACK. This is useful when the caller realized that it did not actually modified the - statement. It avoids the expensive operand re-scan. */ + statement. It avoids the expensive operand re-scan. + This function is deprecated, do not introduce new uses of it. */ void discard_stmt_changes (gimple *stmt_p) { - scb_t buf; - gimple stmt; + gimple *stmt2_p, stmt = *stmt_p; /* It makes no sense to keep track of PHI nodes. */ - stmt = *stmt_p; if (gimple_code (stmt) == GIMPLE_PHI) return; - buf = VEC_pop (scb_t, scb_stack); - gcc_assert (stmt_p == buf->stmt_p); + stmt2_p = VEC_pop (gimple_p, scb_stack); + gcc_assert (stmt_p == stmt2_p); +} + +/* Unlink STMTs virtual definition from the IL by propagating its use. */ + +void +unlink_stmt_vdef (gimple stmt) +{ + use_operand_p use_p; + imm_use_iterator iter; + gimple use_stmt; + tree vdef = gimple_vdef (stmt); + + if (!vdef + || TREE_CODE (vdef) != SSA_NAME) + return; + + FOR_EACH_IMM_USE_STMT (use_stmt, iter, gimple_vdef (stmt)) + { + FOR_EACH_IMM_USE_ON_STMT (use_p, iter) + SET_USE (use_p, gimple_vuse (stmt)); + } - /* Free memory used by the buffer. */ - BITMAP_FREE (buf->loads); - BITMAP_FREE (buf->stores); - buf->stmt_p = NULL; - free (buf); + if (SSA_NAME_OCCURS_IN_ABNORMAL_PHI (gimple_vdef (stmt))) + SSA_NAME_OCCURS_IN_ABNORMAL_PHI (gimple_vuse (stmt)) = 1; } + |
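The notes below are illustrative sketches of how the single-virtual-operand ("VOP") scheme introduced by this patch is meant to be used. None of them are part of the patch itself; all SSA version numbers, callee names and helper names in them are made up.

With append_vdef and append_vuse reduced to recording at most one build_vdef and one build_vuse per statement, a statement that touches memory carries exactly one VDEF and/or one VUSE of the single virtual variable returned by gimple_vop (cfun) (usually printed as .MEM in dumps). A store and a load would show up in a -vops dump roughly as:

    # .MEM_5 = VDEF <.MEM_4>
    *p_2 = x_3;

    # VUSE <.MEM_5>
    y_6 = *q_1;

Alias disambiguation is no longer encoded in the operand lists (there are no memory tags or partitions to enumerate); it is left to the passes that walk this single virtual use-def chain.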
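A minimal sketch of a consumer of the new accessors, assuming the usual GCC-internal headers (tree.h, gimple.h, tree-flow.h); the function name dump_memory_behavior and the FILE argument are illustrative:

    /* Print how STMT participates in the virtual use-def web.  Sketch only,
       not part of the patch.  */

    static void
    dump_memory_behavior (FILE *file, gimple stmt)
    {
      tree vdef = gimple_vdef (stmt);
      tree vuse = gimple_vuse (stmt);

      if (vdef)
        {
          /* A store (or other clobber): it defines a new version of the
             single virtual variable and uses the previous one.  */
          fprintf (file, "store: ");
          print_generic_expr (file, vdef, 0);
          fprintf (file, " = VDEF <");
          print_generic_expr (file, vuse, 0);
          fprintf (file, ">\n");
        }
      else if (vuse)
        {
          /* A pure load: it only consumes the current memory state.  */
          fprintf (file, "load: VUSE <");
          print_generic_expr (file, vuse, 0);
          fprintf (file, ">\n");
        }
      else
        fprintf (file, "no memory side-effects\n");
    }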
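For calls, maybe_add_call_vops now derives the virtual operands purely from the call flags: ECF_NOVOPS and ECF_CONST calls get no virtual operands at all, ECF_PURE and ECF_NORETURN calls get only a VUSE, and every other call gets a VDEF (which, through append_vdef, implies the matching VUSE). An asm with a "memory" clobber is handled like an ordinary call and also gets the VDEF. Illustrative dumps, with made-up callees and SSA versions:

    # .MEM_7 = VDEF <.MEM_6>
    foo (&x);                  ordinary call, may clobber memory

    # VUSE <.MEM_6>
    t_3 = pure_fn (y_2);       pure call, only reads memory

    t_4 = const_fn (y_2);      const call, no virtual operands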
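unlink_stmt_vdef is the piece that lets a pass delete a store without rewriting virtual SSA form: it redirects every immediate use of the statement's VDEF to the statement's VUSE. A minimal sketch of the intended usage when a dead store is removed, assuming the usual GIMPLE iterator helpers gsi_stmt, gsi_remove and release_defs; the function name remove_dead_store is illustrative:

    /* Remove the dead store that GSI points at.  Sketch only, not part of
       the patch.  */

    static void
    remove_dead_store (gimple_stmt_iterator *gsi)
    {
      gimple stmt = gsi_stmt (*gsi);

      /* Re-route all uses of the statement's VDEF to its VUSE so the
         virtual use-def chain stays intact.  */
      unlink_stmt_vdef (stmt);

      /* Now the statement can be taken out of the IL and its definitions
         (including the VDEF itself) released.  */
      gsi_remove (gsi, true);
      release_defs (stmt);
    }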
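Since push_stmt_changes, pop_stmt_changes and discard_stmt_changes are kept only for their existing callers, new code is expected to modify a statement in place and re-scan its operands directly. A minimal sketch of that pattern, assuming STMT is a GIMPLE_ASSIGN with a single-operand right-hand side; replace_rhs_and_update and NEW_RHS are illustrative:

    /* Replace the RHS of STMT with NEW_RHS and bring its operands up to
       date.  Sketch only, not part of the patch.  */

    static void
    replace_rhs_and_update (gimple stmt, tree new_rhs)
    {
      tree op;
      ssa_op_iter iter;

      gimple_assign_set_rhs1 (stmt, new_rhs);
      update_stmt (stmt);

      /* Like pop_stmt_changes, mark any bare GIMPLE register that the
         change exposed for renaming into SSA form.  */
      FOR_EACH_SSA_TREE_OPERAND (op, stmt, iter, SSA_OP_DEF | SSA_OP_USE)
        if (DECL_P (op))
          mark_sym_for_renaming (op);
    }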